`
`UNITED STATES DEPARTMENT OF COMMERCE
`United States Patent and Trademark Office
`Address: COMMISSIONER FOR PATENTS
`P.O. Box 1450
`Alexandria, Virginia 22313-1450
`
`18/041,704
`
`02/15/2023
`
`YASUAKI TAKAHASHI
`
`SYP335560US01
`
`8314
`
`CHIP LAW GROUP
`505 N. LAKE SHORE DRIVE
`SUITE 250
`CHICAGO, IL 60611
`
`CATTUNGAL, ROWINAI
`
`2425
`
`PAPER NUMBER
`
`NOTIFICATION DATE
`
`DELIVERY MODE
`
`11/21/2024
`
`ELECTRONIC
`
`Please find below and/or attached an Office communication concerning this application or proceeding.
`
`The time period for reply, if any, is set in the attached communication.
`
`Notice of the Office communication was sent electronically on above-indicated "Notification Date" to the
`following e-mail address(es):
`
docketing@chiplawgroup.com
eofficeaction@appcoll.com
sonydocket@evalueserve.com
`
`PTOL-90A (Rev. 04/07)
`
`
`
Office Action Summary

Application No.: 18/041,704
Applicant(s): TAKAHASHI et al.
Examiner: ROWINA J CATTUNGAL
Art Unit: 2425
AIA (FITF) Status: Yes
`
`
`
-- The MAILING DATE of this communication appears on the cover sheet with the correspondence address --
`Period for Reply
`
A SHORTENED STATUTORY PERIOD FOR REPLY IS SET TO EXPIRE 3 MONTHS FROM THE MAILING DATE OF THIS COMMUNICATION.
- Extensions of time may be available under the provisions of 37 CFR 1.136(a). In no event, however, may a reply be timely filed after SIX (6) MONTHS from the mailing date of this communication.
- If NO period for reply is specified above, the maximum statutory period will apply and will expire SIX (6) MONTHS from the mailing date of this communication.
- Failure to reply within the set or extended period for reply will, by statute, cause the application to become ABANDONED (35 U.S.C. § 133).
- Any reply received by the Office later than three months after the mailing date of this communication, even if timely filed, may reduce any earned patent term adjustment. See 37 CFR 1.704(b).
`
`Status
`
`
`
1) [X] Responsive to communication(s) filed on 09/26/2024.
   [ ] A declaration(s)/affidavit(s) under 37 CFR 1.130(b) was/were filed on ____.
2a) [X] This action is FINAL.
2b) [ ] This action is non-final.
3) [ ] An election was made by the applicant in response to a restriction requirement set forth during the interview on ____; the restriction requirement and election have been incorporated into this action.
4) [ ] Since this application is in condition for allowance except for formal matters, prosecution as to the merits is closed in accordance with the practice under Ex parte Quayle, 1935 C.D. 11, 453 O.G. 213.
`
Disposition of Claims*
5) [X] Claim(s) 1-11 is/are pending in the application.
   5a) Of the above claim(s) ____ is/are withdrawn from consideration.
6) [ ] Claim(s) ____ is/are allowed.
7) [X] Claim(s) 1-11 is/are rejected.
8) [ ] Claim(s) ____ is/are objected to.
9) [ ] Claim(s) ____ are subject to restriction and/or election requirement.
* If any claims have been determined allowable, you may be eligible to benefit from the Patent Prosecution Highway program at a participating intellectual property office for the corresponding application. For more information, please see http://www.uspto.gov/patents/init_events/pph/index.jsp or send an inquiry to PPHfeedback@uspto.gov.
`
Application Papers
10) [ ] The specification is objected to by the Examiner.
11) [X] The drawing(s) filed on 02/15/2023 is/are: a) [X] accepted or b) [ ] objected to by the Examiner.
Applicant may not request that any objection to the drawing(s) be held in abeyance. See 37 CFR 1.85(a).
Replacement drawing sheet(s) including the correction is required if the drawing(s) is objected to. See 37 CFR 1.121(d).
`
Priority under 35 U.S.C. § 119
12) [X] Acknowledgment is made of a claim for foreign priority under 35 U.S.C. § 119(a)-(d) or (f).
Certified copies:
a) [X] All   b) [ ] Some**   c) [ ] None of the:
1. [X] Certified copies of the priority documents have been received.
2. [ ] Certified copies of the priority documents have been received in Application No. ____.
3. [ ] Copies of the certified copies of the priority documents have been received in this National Stage application from the International Bureau (PCT Rule 17.2(a)).
** See the attached detailed Office action for a list of the certified copies not received.
`
Attachment(s)
1) [ ] Notice of References Cited (PTO-892)
2) [ ] Information Disclosure Statement(s) (PTO/SB/08a and/or PTO/SB/08b) Paper No(s)/Mail Date ____
3) [ ] Interview Summary (PTO-413) Paper No(s)/Mail Date ____
4) [ ] Other: ____

U.S. Patent and Trademark Office
PTOL-326 (Rev. 11-13)
Office Action Summary
Part of Paper No./Mail Date 20241115
`
`
`
`
Notice of Pre-AIA or AIA Status
`
`1.
`
`The present application, filed on or after March 16, 2013,
`
`is being examined under
`
`the first inventor to file provisions of the AIA.
`
`2.
`
`This office action is in response to amendmentfiled 09/26/2024 in which the claims
`
`1-11 are pending.
`
`Response to Arguments
`
`3.
`
`Applicant's arguments filed 09/26/2024 have been fully considered but they are
`
`not persuasive.
`
`The Applicant argues that the combination of McDowall and Stern does not teach,
`
`suggest, or render obviousat least, for example, the features of "a signal processing unit
`
`configured to perform a depth-of-field extension process to generate an extended-depth-
`
`of-field image ... the extended-depth-of-field image is generated based on an extension
`
`of adepth of field using at least two of the first image signal, the second image signal, or
`
`the third image signal," as recited in amended independent claim 1.
`
It was alleged in the Office Action that:

Regarding claim 1 ... McDowall does not explicitly disclose a third imaging element that receives at least light in the second wavelength band from the incident light and outputs a third image signal; and a signal processing unit that performs depth-of-field extension processing to generate an extended-depth-of-field image obtained by extending a depth of field using at least two of the first image signal, the second image signal, and the third image signal; and shorter than an optical path length from the mount surface to the third imaging element.
`
`See Office Action at pages 3 and 5 (emphasis added).
`
`
`
`
The Office relied on Stern to remedy the above-noted deficiency of McDowall.
`
`It was alleged in the Office Action that:
`
Stern discloses a third imaging element that receives at least light in the second wavelength band from the incident light and outputs a third image signal (para [0086] & Fig. 6 teaches the image capture device 600 includes an optical element 610, an optical device 620, a first sensor 630A, a second sensor 630B, a third sensor 630C, and an image processor 650; para [0089] & Fig. 6 teaches the second sensor 630B may be a Bayer sensor, a monochromatic sensor, or any other type of image sensor. The third sensor 630C may be a Bayer sensor, a monochromatic sensor, or any other type of image sensor. The type of sensors may be varied based on lighting conditions); and a signal processing unit that performs depth-of-field extension processing to generate an extended-depth-of-field image obtained by extending a depth of field using at least two of the first image signal, the second image signal, and the third image signal ... (para [0091] teaches the depth of field of the first sensor 630 may overlap with one or more of the depth of fields of the second sensor 630B and the third sensor 630C. The distance D1, the distance D2, the distance D3, or any combination thereof, may each be adjusted to achieve any desired depth of field. In some embodiments the optical path length of the first ray path 660A may be greater than the optical path length of the second ray path 660B, the optical path length of the third ray path 660C, or both. In some embodiments, the optical path length of the second ray path 660B may be greater than the optical path length of the first ray path 660A, the optical path length of the third ray path 660C, or both. In some embodiments, the optical path length of the first ray path 660A, the optical path length of the second ray path 660B, and the optical path length of the third ray path 660C may all be different).
`
`
`
`
`See Office Action at pages 3, 5 and 6 (emphasis added).
`
`
`Stern describes "the image capture device 600 includes an optical element 610,
`
`an optical device 620, a first sensor 630A, a second sensor 630B, a third sensor 630C,
`
`and an image processor 650." See Stern at [0086]. Stern further describes "[t]he
`
`depth of field of the first sensor 630A is based onthefirst ray path 660A. Thefirst ray
`
`path 660A has an optical path length that may be configured by adjusting the distance D1
`
`betweenthefirst sensor 630A and the optical device 620. Accordingly, the depth offield
`
`of the first sensor 630A may be configured by increasing or decreasing the optical path
`
`length of the first ray path 660A. The depth offield of the second sensor 630B is based
`
`on the second ray path 660B. The second ray path 660B has an optical path length that
`
`may be configured by adjusting the distance D2 between the second sensor 630B and
`
`the optical device 620. Accordingly, the depth of field of the second sensor 630B may be
`
`configured by increasing or decreasing the optical path length of the second ray path
`
`660B. The depth offield of the third sensor 630C is based onthethird ray path 660C. The
`
`third ray path 660C has an optical path length that may be configured by adjusting the
`
`distance D3 between thethird sensor 630C and the optical device 620. Accordingly, the
`
`depth of field of the third sensor 630C may be configured by increasing or decreasing the
`
`optical path length of the third ray path 660C. The depth of field of the first sensor 630
`
`may overlap with one or moreof the depth of fields of the second sensor 630B and the
`
`third sensor 630C. The distance D1, the distance D2, the distance D3, or any combination
`
`thereof, may each be adjusted to achieve any desired depth of field." See Stern at [0091].
`
`Stern describes the image capture device 600 including the optical device 620, thefirst
`
`sensor 630A, the second sensor 630B, and the third sensor 630C. Stern further describes
`
`
`
`Application/Control Number: 18/041,704
`Art Unit: 2425
`
`Page 5
`
`that the depth of field of the first sensor 630A is configured by increasing or decreasing
`
`the optical path length of the first ray path 660A, the depth of field of the second sensor
`
`630B is configured by increasing or decreasing the optical path length of the second ray
`
`path 660B, and the depth offield of the third sensor 630C is configured by increasing or
`
`decreasing the optical path length of the third ray path 660C. However, Sternin its entirety
`
`does not describe that an extended-depth- of-field image is generated by an extension of
`
`a depth of field based on at least_two of a first image signal by the first sensor 630A, a
`
`second image signal by the second sensor 630B, or a third image signal by the third
`
`sensor 6300.
`
Therefore, the Applicant submits that the combination of McDowall and Stern does not teach, suggest, or render obvious at least, for example, the features of "a signal processing unit configured to perform a depth-of-field extension process to generate an extended-depth-of-field image ... the extended-depth-of-field image is generated based on an extension of a depth of field using at least two of the first image signal, the second image signal, or the third image signal," as recited in amended independent claim 1. Therefore, amended independent claim 1 is not taught, suggested, or rendered obvious over the combination of McDowall and Stern. The Applicant submits that amended independent claims 10 and 11 are also not taught, suggested, or rendered obvious over the combination of McDowall and Stern at least for the reasons stated above with regard to amended independent claim 1.
`
Examiner respectfully disagrees and clarifies below, as McDowall et al. and Stern et al. clearly disclose the amended features of claim 1.
`
`
`
`
McDowall et al. discloses a signal processing unit configured to perform a depth-of-field extension process to generate an extended-depth-of-field image, wherein the extended-depth-of-field image is generated based on an extension of a depth of field using at least two of the first image signal, the second image signal, or the third image signal, as recited in amended independent claim 1.
`
McDowall et al. is an image capture unit and method with an extended depth of field. For example, Fig. 8D is an illustration of one combination of the two images of FIGS. 8B and 8C combined to form an in-focus image with extended depth of field. Para [0280]-[0282] teaches Extended Depth of Field dual image enhancement module 240R (240L) generates an in-focus extended depth of field image from the two images captured by image capture sensors 810R (810L) and 815R (815L). Para [0298] teaches Dual image enhancement module 240R (240L) generates an in-focus extended depth of field image from the two images captured by image capture sensors 810R (810L) and 815R (815L).
`
Stern discloses a signal processing unit configured to perform a depth-of-field extension process to generate an extended-depth-of-field image, wherein the extended-depth-of-field image is generated based on an extension of a depth of field using at least two of the first image signal, the second image signal, or the third image signal, as recited in amended independent claim 1.
`
Stern et al. is an image capture device with extended depth of field. The Abstract teaches the image processor may be configured to obtain a focused image based on a first image and a second image. The focused image may have an extended depth of field. The extended depth of field may be based on the depth of field of each respective optical element.
`
`
`
`
Para [0077] & Fig. 4, for example: image processor 450 is configured to receive the image from the first sensor 430 and the second sensor 440. The image processor 450 is configured to combine the image from the first sensor 430 and the image from the second sensor 440 to obtain a focused image. The focused image has an extended depth of field that is based on the first optical path length and the second optical path length. In an example where the first sensor 430 has a depth of field that is from about 0.6 m to infinity (i.e., greater than 10 m), and the second sensor 440 has a depth of field that is from about 0.4 m to about 1.0 m, the extended depth of field is from about 0.4 m to infinity.

Para [0084] & Fig. 5 teaches the image processor 550 is configured to receive the image from the first sensor 530 and the second sensor 540. The image processor 550 is configured to combine the image from the first sensor 530 and the image from the second sensor 540 to obtain a focused image. The focused image has an extended depth of field that is based on the first optical path length and the second optical path length. In an example where the first sensor 530 has a depth of field that is from about 0.6 m to infinity (i.e., greater than 10 m), and the second sensor 540 has a depth of field that is from about 0.4 m to about 1.0 m, the extended depth of field is from about 0.4 m to infinity.
`
Thus, the above two citations teach the amended claim limitation, wherein the extended-depth-of-field image is generated based on an extension of a depth of field using at least two of the first image signal, the second image signal, or the third image signal.
`
Also, Para [0092] & Fig. 6 as cited earlier also teaches amended independent claim 1, since it includes the third image signal from third sensor 630C. Para [0092] teaches the focused image has an extended depth of field that is based on the first optical path length, the second optical path length, and the third optical path length. In an example where the first sensor 630A has a depth of field that is from about 0.6 m to infinity (i.e., greater than 10 m), the second sensor 630B has a depth of field that is from about 0.4 m to about 1.0 m, and the third sensor 630C has a depth of field that is from about 0.1 m to about 0.5 m, the extended depth of field is from about 0.1 m to infinity.
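
For reference only (this sketch is not part of the cited references or the claims), the arithmetic in these quoted examples amounts to taking the union of overlapping per-sensor depth-of-field intervals. A minimal Python sketch under that assumption, using the ranges quoted from Stern's para [0092] example; the function and variable names are hypothetical:

```python
# Hypothetical sketch (not from McDowall or Stern): an extended depth of
# field computed as the union of overlapping per-sensor depth-of-field
# intervals, using the ranges quoted from Stern's para [0092] example.
import math

def extended_depth_of_field(intervals):
    """Merge overlapping (near, far) depth-of-field intervals, in meters.

    A single merged interval corresponds to one continuous extended
    depth of field covering all of the input ranges."""
    merged = []
    for near, far in sorted(intervals):
        if merged and near <= merged[-1][1]:   # overlaps the previous interval
            merged[-1][1] = max(merged[-1][1], far)
        else:
            merged.append([near, far])
    return merged

# First sensor ~0.6 m to infinity, second ~0.4-1.0 m, third ~0.1-0.5 m:
dofs = [(0.6, math.inf), (0.4, 1.0), (0.1, 0.5)]
print(extended_depth_of_field(dofs))   # [[0.1, inf]] -> about 0.1 m to infinity
```

The printed result, a single interval from about 0.1 m to infinity, matches the extended depth of field stated in the cited example.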
`
Thus, amended independent claim 1 is obvious over the combination of McDowall and Stern. Amended independent claims 10 and 11 are also rendered obvious over the combination of McDowall and Stern at least for the reasons stated above with regard to the response to amended independent claim 1.
`
`Claim Rejections - 35 USC § 103
`
`4.
`
`In the event the determination of the status of the application as subject to AIA 35
`
`U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any
`
`correction of the statutory basis (i.e., changing from AIA to pre-AlA) for the rejection will
`
`not be considered a new ground ofrejection if the prior art relied upon, and the rationale
`
`supporting the rejection, would be the same undereither status.
`
`5.
`
`The following is a quotation of 35 U.S.C. 103 which forms the basis for all
`
`obviousness rejections set forth in this Office action:
`
`A patent for a claimed invention may not be obtained, notwithstanding that
`
`the claimed invention is not identically disclosed as set forth in section 102,
`
`if the differences between the claimed invention and the prior art are such
`
`that the claimed invention as a whole would have been obvious before the
`
`effectivefiling date of the claimed invention to a person having ordinary skill
`
`
`
`Application/Control Number: 18/041,704
`Art Unit: 2425
`
`Page 9
`
`in the art to which the claimed invention pertains. Patentability shall not be
`
`negated by the mannerin which the invention was made.
`
`6.
`
`The factual inquiries for establishing a background for determining obviousness
`
`under 35 U.S.C. 103 are summarized asfollows:
`
`1. Determining the scope and contents of the priorart.
`
`2. Ascertaining the differences between the prior art and the claims at issue.
`
`3. Resolving the level of ordinary skill in the pertinent art.
`
`4. Considering
`
`objective
`
`evidence present
`
`in
`
`the
`
`application
`
`indicating
`
`obviousness or nonobviousness.
`
`7.
`
`Claims 1-4, 6-11 are rejected under 35 U.S.C. 103 as being unpatentable over
`
`McDowall et al. (US 2013/0038689 A1) in view of Stern et al. (US 2021/0037187 A1).
`
Regarding claim 1, McDowall discloses a medical imaging system (Fig. 2; Abstract teaches surgical system for observing enhanced images) comprising: a first imaging element configured to: receive light in a first wavelength band from light incident from a mount surface; and output a first image signal (Fig. 3A teaches first image capture sensor 310L & 310R & Fig. 8A, 810R, 815R, 810L, 815L); a second imaging element configured to: receive light in a second wavelength band, different from the first wavelength band, from the incident light; and output a second image signal (Fig. 3A teaches second image capture sensor 315L & 315R; para [0237] teaches the notch filter for the blue wavelengths is configured to reflect color component B1 and to pass other blue wavelengths so that the fluorescence from the Fluorescein is passed to image capture sensor 715R (715L). The notch filters and the illumination components from the illuminator for the red and green color components are selected so that the reflected color components R1, B1, and G1 are captured by first image capture sensor 710R (710L). The remaining colors are delivered to second image capture sensor 715R (715L). Note that the excitation laser component B1 may saturate the blue pixels of first image capture sensor 710R (710L), but this is acceptable as the fluorescence from the Fluorescein is captured by second image capture sensor 715R (715L)); and a signal processing unit configured to perform a depth-of-field extension process to generate an extended-depth-of-field image, wherein the extended-depth-of-field image is generated based on an extension of a depth of field using at least two of the first image signal, the second image signal (para [0057] & Fig. 8D & Para [0282] teaches generating a composite image whose depth of field is better than each of the constituent images), an optical path length from the mount surface to the first imaging element is longer than an optical path length from the mount surface to the second imaging element (Para [0021], Para [0036], [0055] & Fig. 8A teaches optical path lengths to the two image sensors are different & Fig. 6A first imaging sensor 610L and third image sensor 615R having different optical path length, wherein third imaging sensor 615R is different from second imaging sensor 615L; Para [0291]-[0295] teaches the light going to image capture sensor 810R (810L) is captured as a first image focused at a first object distance DA from image capture unit 825R (825L). See for example FIG. 8B. The light going to image capture sensor 815R (815L) takes a slightly longer path to arrive at sensor 815R (815L) and is captured as a second image focused at a second object distance DB from the image capture unit. See for example FIG. 8C. Distance DB is larger than distance DA. The difference in the path lengths is based on the front end optical design, e.g., focal length of lens element 804R (804L), and the distance for the best focus of each of the two captured images).
`
McDowall does not explicitly disclose a third imaging element configured to: receive at least light in the second wavelength band from the incident light; and output a third image signal; and a signal processing unit configured to perform a depth-of-field extension process to generate an extended-depth-of-field image, wherein the extended-depth-of-field image is generated based on an extension of a depth of field using at least two of the first image signal, the second image signal or the third image signal; and the optical path length from the mount surface to the first imaging element is shorter than an optical path length from the mount surface to the third imaging element.

However, Stern discloses a third imaging element configured to: receive at least light in the second wavelength band from the incident light; and output a third image signal (para [0086] & Fig. 6 teaches the image capture device 600 includes an optical element 610, an optical device 620, a first sensor 630A, a second sensor 630B, a third sensor 630C, and an image processor 650; Para [0089] & Fig. 6 teaches the second sensor 630B may be a Bayer sensor, a monochromatic sensor, or any other type of image sensor. The third sensor 630C may be a Bayer sensor, a monochromatic sensor, or any other type of image sensor. The type of sensors may be varied based on lighting conditions); and a signal processing unit configured to perform a depth-of-field extension process to generate an extended-depth-of-field image, wherein the extended-depth-of-field image is generated based on an extension of a depth of field using at least two of the first image signal, the second image signal or the third image signal (Para [0077] & Fig. 4, for example: image processor 450 is configured to receive the image from the first sensor 430 and the second sensor 440. The image processor 450 is configured to combine the image from the first sensor 430 and the image from the second sensor 440 to obtain a focused image. The focused image has an extended depth of field that is based on the first optical path length and the second optical path length. In an example where the first sensor 430 has a depth of field that is from about 0.6 m to infinity (i.e., greater than 10 m), and the second sensor 440 has a depth of field that is from about 0.4 m to about 1.0 m, the extended depth of field is from about 0.4 m to infinity. Para [0084] & Fig. 5 teaches the image processor 550 is configured to receive the image from the first sensor 530 and the second sensor 540. The image processor 550 is configured to combine the image from the first sensor 530 and the image from the second sensor 540 to obtain a focused image. The focused image has an extended depth of field that is based on the first optical path length and the second optical path length. In an example where the first sensor 530 has a depth of field that is from about 0.6 m to infinity (i.e., greater than 10 m), and the second sensor 540 has a depth of field that is from about 0.4 m to about 1.0 m, the extended depth of field is from about 0.4 m to infinity. Para [0092] teaches the focused image has an extended depth of field that is based on the first optical path length, the second optical path length, and the third optical path length. In an example where the first sensor 630A has a depth of field that is from about 0.6 m to infinity (i.e., greater than 10 m), the second sensor 630B has a depth of field that is from about 0.4 m to about 1.0 m, and the third sensor 630C has a depth of field that is from about 0.1 m to about 0.5 m, the extended depth of field is from about 0.1 m to infinity), and the optical path length from the mount surface to the first imaging element is shorter than an optical path length from the mount surface to the third imaging element (para [0091] teaches the depth of field of the first sensor 630 may overlap with one or more of the depth of fields of the second sensor 630B and the third sensor 630C. The distance D1, the distance D2, the distance D3, or any combination thereof, may each be adjusted to achieve any desired depth of field. In some embodiments the optical path length of the first ray path 660A may be greater than the optical path length of the second ray path 660B, the optical path length of the third ray path 660C, or both. In some embodiments, the optical path length of the second ray path 660B may be greater than the optical path length of the first ray path 660A, the optical path length of the third ray path 660C, or both. In some embodiments, the optical path length of the first ray path 660A, the optical path length of the second ray path 660B, and the optical path length of the third ray path 660C may all be different).
`
It would have been obvious to one having ordinary skill in the art before the effective filing date of the invention to use the method of achieving an effectively extended depth of field by having the optical path length from the distal face to the first sensor surface of McDowall with the method of having lenses with offset focal distances and the resulting images combined using a focus stacking computational technique to create an enhanced depth of field image or video stream of Stern, in order to provide a system with increased depth of field, allow lenses to have a larger aperture, and increase the total focus range of the image capture device.
`
`
`
`
Regarding claim 2, McDowall in view of Stern discloses the medical imaging system according to claim 1. McDowall further discloses wherein the second wavelength band includes a visible wavelength (para [0260] teaches sensor assembly 720R (720L) also enables imaging fluorescence in real time at nearly any wavelength in the visible or near infrared, and enables imaging natural tissue fluorescence); the signal processing unit is further configured to generate the extended-depth-of-field image based on a selection of a pixel value having a higher contrast out of pixel values of pixels at a same pixel position of the at least two of the first image signal, the second image signal, or the third image signal (Para [0207]-[0210] teaches pixels that are each based on two pixels for a point in space have reduced noise level for mid brightness regions and the dynamic range is extended. Brightness values from 0 to 400 are mapped to the dynamic output range (contrast ratio) of stereoscopic display 251 by the tone mapping. For pixels that are not saturated, the brightness value in the image captured by sensor 615R (615L) is three times the brightness value in the image captured by sensor 610R (610L) for the same point in space. The tone mapping process utilizes this relationship in determining the brightness for a pixel in an image output to stereoscopic display 251. Based on information on the surrounding pixels, tone mapping works on individual groups/regions of pixels to maintain the contrast and thus reduces the noise level and the dynamic range).
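
For illustration only, the per-pixel selection recited in claim 2 (keeping, at each pixel position, the value from whichever image signal shows the higher contrast) is a standard focus-stacking step. Below is a minimal Python sketch, not taken from McDowall or Stern; the Laplacian-based contrast measure and all names are assumptions:

```python
# Hypothetical sketch (not from the cited references): generate an
# extended-depth-of-field image by selecting, at each pixel position,
# the pixel value from whichever input image has the higher local
# contrast there. Local contrast is estimated with a Laplacian
# magnitude, one common choice among several.
import numpy as np
from scipy.ndimage import laplace

def extend_depth_of_field(images):
    """images: list of 2-D grayscale arrays focused at different distances.

    Returns one array holding, per pixel, the value from the image whose
    local contrast at that pixel position is highest."""
    stack = np.stack([np.asarray(img, dtype=float) for img in images])
    # Per-image local contrast map: absolute Laplacian response.
    contrast = np.stack([np.abs(laplace(img)) for img in stack])
    best = np.argmax(contrast, axis=0)          # sharpest image index per pixel
    rows, cols = np.indices(best.shape)
    return stack[best, rows, cols]

# Usage with two synthetic "image signals" (e.g., near- and far-focused):
near_focus = np.random.rand(8, 8)
far_focus = np.random.rand(8, 8)
edof = extend_depth_of_field([near_focus, far_focus])
```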
`
Regarding claim 3, Stern discloses the medical imaging system according to claim 1, further comprising an optical splitting system configured to split the incident light toward each of the first imaging element, the second imaging element, and the third imaging element (para [0088]: The optical device 620 is configured to receive an image via the optical element 610. The optical device 620 may be any type of device configured to direct light onto one or more image sensors. In some implementations, the optical device may be a beam splitter or a lenticular lens. As shown in FIG. 6, the optical device 620 is shown as a beam splitter that is configured to refract the received light and create a first ray path 660A, a second ray path 660B, and a third ray path 660C directed to the first sensor 630A, the second sensor 630B, and the third sensor 630C, respectively), wherein the optical splitting system is set such that an amount of light incident on the third imaging element is greater than an amount of light incident on the second imaging element (para [0089] teaches, in an example where the first sensor 630A or the second sensor 630B is a Bayer sensor and the third sensor 630C is a monochromatic sensor, the amount of light required for image capture may be decreased by a factor of 3 or more). Motivation to combine as indicated in claim 1.
`
Regarding claim 4, McDowall discloses the medical imaging system according to claim 1, wherein the first imaging element is further configured to receive near infrared (NIR) light (para [0260] teaches sensor assembly 720R (720L) also enables imaging fluorescence in real time at nearly any wavelength in the visible or near infrared, and enables imaging natural tissue fluorescence).
`
Regarding claim 6, McDowall in view of Stern discloses the medical imaging system according to claim 1. McDowall further discloses wherein the second imaging element includes a color filter and the first imaging element includes no color filter (para [0033] teaches first image capture sensor is a monochrome image capture sensor, and the second image capture sensor is an image capture sensor having a color filter array for the other color components in the plurality of color components; para [0180] teaches color filter array is removed from one of the color sensors and the sensor functions as a monochrome sensor).

McDowall does not explicitly disclose wherein each of the second imaging element and the third imaging element includes a color filter, and the signal processing unit is further configured to perform the depth-of-field extension process based on the second image signal and the third image signal. However, Stern discloses wherein each of the second imaging element and the third imaging element includes a color filter (para [0089] & Fig. 6 teaches the first sensor 630A may be a Bayer sensor, a monochromatic sensor, or any other type of image sensor. The second sensor 630B may be a Bayer sensor, a monochromatic sensor, or any other type of image sensor. The third sensor 630C may be a Bayer sensor, a monochromatic sensor, or any other type of image sensor. The type of sensors may be varied based on lighting conditions) and the signal processing unit is further configured to perform the depth-of-field extension process based on the second image signal and the third image signal (para [0087] & Fig. 6 teaches the optical element 610 may be any type of lens or element that is configured to receive and direct light onto one or more image sensors, for example the first sensor 630A, the second sensor 630B, the third sensor 630C, or any combination thereof. T