(19)
(11)EP 2 535 923 B1

(12)EUROPEAN PATENT SPECIFICATION

(45)Mention of the grant of the patent:
17.06.2020 Bulletin 2020/25

(21)Application number: 11739588.9

(22)Date of filing:  08.02.2011
(51)International Patent Classification (IPC): 
G01B 11/02(2006.01)
H01L 21/68(2006.01)
H01L 21/66(2006.01)
H01L 21/67(2006.01)
(86)International application number:
PCT/JP2011/000700
(87)International publication number:
WO 2011/096239 (11.08.2011 Gazette  2011/32)

(54)

DETECTION METHOD AND DETECTION DEVICE

ERKENNUNGSVERFAHREN UND ERKENNUNGSVORRICHTUNG

PROCÉDÉ DE DÉTECTION ET DISPOSITIF DE DÉTECTION


(84)Designated Contracting States:
AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

(30)Priority: 08.02.2010 JP 2010025935

(43)Date of publication of application:
19.12.2012 Bulletin 2012/51

(73)Proprietor: Nikon Corporation
Tokyo 108-6290 (JP)

(72)Inventors:
  • KITO, Yoshiaki
    Tokyo 100-8331 (JP)
  • ARAI, Masanori
    Tokyo 100-8331 (JP)
  • FUKUI, Tatsuo
    Tokyo 100-8331 (JP)

(74)Representative: Hoffmann Eitle
Patent- und Rechtsanwälte PartmbB
Arabellastraße 30
81925 München (DE)


(56)References cited:
EP-A2- 1 178 521
WO-A1-2008/153086
JP-A- 2001 006 998
JP-A- 2006 220 425
US-A1- 2002 130 785
WO-A1-98/57361
JP-A- 5 332 719
JP-A- 2002 050 749
JP-A- 2009 139 285
  
      
    Note: Within nine months from the publication of the mention of the grant of the European patent, any person may give notice to the European Patent Office of opposition to the European patent granted. Notice of opposition shall be filed in a written reasoned statement. It shall not be deemed to have been filed until the opposition fee has been paid. (Art. 99(1) European Patent Convention).


    Description

    TECHNICAL FIELD



    [0001] The present invention relates to a detection method and a detection apparatus.

    BACKGROUND ART



    [0002] Of particular interest is a layered semiconductor device in which a plurality of substrates with electronic circuits formed thereon are stacked on each other in order to provide increased mounting density for the semiconductor device. To stack a plurality of substrates on each other, a substrate bonding apparatus may be used to align and bond the substrates (see, for example, Japanese Patent Application Publication No. 2009-231671). Documents cited during prosecution include EP1178521 A2, which discloses methods and apparatus for processing composite members, and US 2002/0130785 A1, which discloses a semiconductor wafer imaging system.

    DISCLOSURE OF THE INVENTION


    PROBLEMS TO BE SOLVED BY THE INVENTION



    [0003] To stack a plurality of substrates on each other, the substrates may be appropriately positioned by referring to the outlines of the respective substrates. In this case, the outlines are detected by means of a transmissive optical system. However, when such an optical system is used to detect the outline of a layered substrate having a plurality of substrates stacked on each other and, in particular, the upper substrate is smaller in outline than the lower substrate, it is difficult to detect an accurate outline of the upper substrate.

    MEANS FOR SOLVING THE PROBLEMS



    [0004] A first aspect of the innovations may include a detection method of detecting a position of a particular one of a plurality of substrates stacked on each other as recited in Claim 1 below.

    [0005] A second aspect of the innovations may include a detection apparatus for detecting a position of a particular one of a plurality of substrates stacked on each other as recited in Claim 19 below. The dependent claims define particular embodiments of each respective aspect. The above and other features and advantages of the present invention will become more apparent from the following description of the embodiments taken in conjunction with the accompanying drawings.

    BRIEF DESCRIPTION OF THE DRAWINGS



    [0006] 

    Fig. 1 is a perspective view schematically illustrating the structure of a detection apparatus 100.

    Fig. 2 is an explanatory view illustrating an image 106 of a portion of an edge of a substrate obtained by an image obtaining section.

    Fig. 3 illustrates how to identify the position of the portion of the edge of the substrate by means of a position identifying section.

    Fig. 4 shows a curve to illustrate how luminance varies at a step-like portion E.

    Fig. 5 is an explanatory view illustrating the detection conditions under which the detection apparatus operates.

    Fig. 6 is an explanatory view illustrating how to obtain images of three different portions of the edge of the substrate.

    Fig. 7 is an explanatory view illustrating how to obtain images while moving substrates.

    Fig. 8 is an explanatory view illustrating how to obtain images while moving substrates.

    Fig. 9 is an explanatory view illustrating how to judge the outline and position of a substrate based on the detected position of the edge of the substrate.

    Fig. 10 is a front view illustrating an embodiment of scanning incident light.

    Fig. 11 is a front view illustrating the embodiment of scanning incident light.

    Fig. 12 is a front view illustrating another embodiment of scanning incident light.

    Fig. 13 is an explanatory view illustrating how to obtain images of four different portions of the edge of the substrate.

    Fig. 14 is an explanatory view showing an image of a portion of edges of substrates of a three-layered substrate.


    BEST MODE FOR CARRYING OUT THE INVENTION



    [0007] Hereinafter, some embodiments of the present invention will be described. The embodiments do not limit the invention according to the claims, and not all of the combinations of features described in the embodiments are necessarily essential to the means provided by aspects of the invention.

    [0008] Fig. 1 is a perspective view schematically illustrating the structure of a detection apparatus 100 relating to an embodiment. The detection apparatus 100 is configured to detect the position of an upper substrate 104, which is stacked on a lower substrate 102. The detection apparatus 100 includes a stage 101, an illuminating section 108, an image obtaining section 110, and a position identifying section 120.

    [0009] The lower substrate 102 and the upper substrate 104 are stacked in the thickness direction by means of a substrate bonding apparatus or the like. The upper substrate 104 is smaller in outline than the lower substrate 102. Therefore, at the edge of the upper substrate 104, a step is formed between the upper surface of the upper substrate 104 and the upper surface of the lower substrate 102.

    [0010] The stage 101 is configured to have the lower substrate 102 and the upper substrate 104 placed thereon for edge detection. The stage 101 is translated along the X, Y and Z axes. The stage 101 may be a stage for use in an apparatus configured to bond another substrate onto the upper substrate 104 or the like. In this case, the stage 101 may be configured to rotate about the X, Y and Z axes. On the upper surface of the stage 101, a reference mark 103 is provided. In the perspective views including Fig. 1, the X and Y axes respectively extend in the left-right and front-back directions within the upper surface of the stage 101. The Z axis extends upwards perpendicularly to the X and Y axes.

    [0011] The reference mark 103 is used to, for example, adjust the illuminating section 108 and the image obtaining section 110. For example, prior to the task of detecting the position of a substrate, the reference mark 103 is used to bring an optical system into focus so that an image capturing section 105 can form a sharp image of the reference mark 103 when a slit image 114 is applied to the reference mark 103. Furthermore, the reference mark 103 is used to associate a position on the stage 101 with a position on the image captured by the image capturing section 105.

    [0012] The illuminating section 108 provides the slit image 114 used to detect a position of a substrate. The illuminating section 108 includes a light source 119, a lens 118, a slit 116, and a lens 115 in the stated order.

    [0013] The light source 119 emits light having a wavelength that can be detected by the image capturing section 105, for example, emits visible light when the image capturing section 105 is capable of imaging visible light. The lens 118 collects the light from the light source 119. The slit 116 delimits the illumination used to detect the position of the upper substrate 104. The lens 115 collects the light that has passed through the slit 116 to form the slit image 114 on the upper surface of the lower substrate 102 and the upper surface of the upper substrate 104.

    [0014] The illuminating section 108 illuminates the lower substrate 102 and the upper substrate 104 at an angle with respect to the plane orientation of the lower substrate 102 and the upper substrate 104, for example obliquely downward from the top left in Fig. 1. The slit image 114 provided by the illuminating section 108 has, on the lower substrate 102 and the upper substrate 104, an elongated shape extending in the radial direction of the disk-like lower substrate 102 and upper substrate 104. The area illuminated by the slit image 114 covers a portion of the edge of the upper substrate 104. The illuminating section 108 stores in advance the position where the edge is expected to lie when the layered substrate is correctly placed at a predetermined position on the stage 101, and applies illumination to the expected position. The edge is a circumference when the lower substrate 102 and the like are disk-shaped. The edge may have a characteristic such as a notch.

    [0015] The image obtaining section 110 includes an image capturing section 105 and a lens 112. The image obtaining section 110 images a region covering a portion of the edge of the upper substrate 104 at an angle with respect to the plane orientation of the upper substrate 104 and the like, for example obliquely downward from the top right in Fig. 1. In this case, the image obtaining section 110 also stores in advance the position where the edge is expected to lie when the layered substrate is correctly placed at a predetermined position on the stage 101, and images a region covering the expected position.

    [0016] The lens 112 focuses the light reflected from the upper surfaces of the lower substrate 102 and the upper substrate 104 onto the image capturing section 105. Examples of the image capturing section 105 include a CCD, a CMOS sensor, or the like having two-dimensionally arranged pixels. The image capturing section 105 produces an image 106 by converting, pixel by pixel, the optical signals of the image formed on the image capturing surface into electrical signals. The position identifying section 120 analyzes the image 106 and identifies the position of the edge of the upper substrate 104 based on the position of the step-like portion present in the image 106.

    [0017] The optical systems of the illuminating section 108 and the image obtaining section 110 are not limited to the structures shown in Fig. 1. For example, the lenses 118, 115 and 112 only schematically represent the optical systems, and each need not be a single lens. The optical system of the image obtaining section 110 may be non-tilted lens optics or tilted lens optics. If tilted lens optics are employed, the image obtaining section 110 can, by tilting the image capturing section 105, keep in focus a large region on the surfaces of the upper substrate 104 and the lower substrate 102, which are at an angle with respect to the principal ray.

    [0018] The following describes a detection method of detecting the position of the upper substrate 104 using the detection apparatus 100 shown in Fig. 1. The detection method includes a step of obtaining an image and a step of identifying the position. The step of obtaining an image includes a step of applying, by the illuminating section 108, illumination from the top left to a region covering a portion of the edge of the upper substrate 104 to form the slit image 114, and a step of imaging, by the image capturing section 105, at an angle with respect to the plane orientation of the lower substrate 102 and the upper substrate 104, the slit image 114 reflected by the upper surfaces of the lower substrate 102 and the upper substrate 104 to obtain the image 106.

    [0019] Fig. 2 is an explanatory view illustrating the image 106 of a portion of the edge of the substrate obtained by the image obtaining section 110. In the image 106, an upper substrate reflected image 132 corresponds to a portion of the slit image 114 that is reflected by the upper substrate 104. On the other hand, a lower substrate reflected image 134 corresponds to a portion of the slit image 114 that is reflected by the lower substrate 102.

    [0020] The step of identifying the position includes a step of forwarding the image 106 from the image capturing section 105 to the position identifying section 120 and a step of performing image analysis by the position identifying section 120 to identify the position of the edge of the upper substrate 104 based on the position of the step-like portion E present between the upper substrate reflected image 132 and the lower substrate reflected image 134.

    [0021] The position of the step-like portion E in the image 106 corresponds to the position of the edge of the upper substrate 104. In Fig. 1, when the edge of the upper substrate 104 is moved toward the back within the region illuminated by the slit image 114, the position of the step-like portion E is moved to the left in the image 106. On the other hand, when the edge of the upper substrate 104 is moved toward the front in Fig. 1, the position of the step-like portion E is moved to the right in the image 106. Thus, the position of the edge of the upper substrate 104 can be identified by analyzing the position of the step-like portion E.
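    The correspondence described above between the step position in the image 106 and the edge position on the stage 101 rests on the calibration against the reference mark 103 mentioned in paragraph [0011]. The following is a minimal sketch of such a conversion; the linear calibration, the sign convention and the names used (image_to_stage, mm_per_px) are illustrative assumptions rather than details taken from the specification.

```python
def image_to_stage(step_px, ref_px, ref_stage_mm, mm_per_px):
    """Convert the horizontal position of the step-like portion E in the
    image 106 (in pixels) into a coordinate on the stage 101 (in mm),
    using a linear calibration obtained by imaging the reference mark 103.

    step_px      -- column of the step-like portion E in the image 106
    ref_px       -- column at which the reference mark 103 appears
    ref_stage_mm -- known stage coordinate of the reference mark 103
    mm_per_px    -- stage displacement corresponding to one pixel
    """
    return ref_stage_mm + (step_px - ref_px) * mm_per_px
```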

    [0022] The position identifying section 120 stores thereon in advance the vertical width D of the upper substrate reflected image 132 based on the size of the slit 116, the optical magnifications of the illuminating section 108 and the image obtaining section 110, and the like. The position identifying section 120 stores thereon in advance a maximum value Lmax of the horizontal width L of the upper substrate reflected image 132 based on the size of the slit 116, the optical magnifications of the illuminating section 108 and the image obtaining section 110 and the like.

    [0023] To analyze the image 106, a selection window 136 is first used to select a region of the image to be analyzed. In order to identify the upper and lower boundaries of the upper substrate reflected image 132 in the image 106, the vertical width b of the selection window 136 is preferably larger than the width D and the horizontal width a of the selection window 136 is preferably smaller than the width Lmax. Since the upper substrate reflected image 132 has higher luminance than its surroundings, the position identifying section 120 can identify the upper and lower boundaries and the width D of the upper substrate reflected image 132 by analyzing the vertical luminance variation in the image selected by the selection window 136.

    [0024] Fig. 3 illustrates how to identify the position of the step-like portion. In order to identify the position of the step-like portion E, the horizontal width a of the selection window 136 is preferably larger than the width Lmax, and the vertical width b of the selection window 136 is preferably smaller than the width D. The position identifying section 120 can identify the position of the step-like portion E by analyzing the horizontal luminance variation in the image selected by the selection window 136. The position of the step-like portion E in the image 106 can be used to identify the position, on the stage 101, of the edge of the upper substrate 104 in the region illuminated by the slit image 114.
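    As an illustration of the window-based analysis of paragraphs [0023] and [0024], the sketch below locates the step-like portion E by averaging the luminance over the rows of the selection window 136 and finding where the bright upper substrate reflected image 132 ends. It assumes the image 106 is available as a two-dimensional NumPy array; the function name, the fixed threshold and the window convention are hypothetical.

```python
import numpy as np

def find_step_column(image, window, threshold=0.5):
    """Return the column of the step-like portion E inside the selection
    window 136, given as (row0, row1, col0, col1) with width a > Lmax and
    height b < D, as described in paragraph [0024]."""
    row0, row1, col0, col1 = window
    # Horizontal luminance profile, averaged over the rows of the window.
    profile = image[row0:row1, col0:col1].astype(float).mean(axis=0)
    # Normalise to the 0..1 range and find where the profile first drops
    # below the threshold, i.e. where the bright reflected image 132 ends.
    span = profile.max() - profile.min()
    norm = (profile - profile.min()) / (span if span > 0 else 1.0)
    below = np.flatnonzero(norm < threshold)
    if below.size == 0:
        raise ValueError("no step-like portion found in the window")
    return col0 + int(below[0])
```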

    [0025] Fig. 4 shows a curve illustrating how the luminance varies at the step-like portion E of the upper substrate reflected image 132 present in the image 106. In Fig. 4, the horizontal axis represents the horizontal coordinates in the image 106 shown in Fig. 2 and the like, and the vertical axis represents the luminance. Fig. 4 shows the luminance variation in the upper substrate reflected image 132. The upper substrate reflected image 132 is ideally expected to exhibit a sharp luminance variation at the step-like portion E, as indicated by a polygonal line 142. In reality, however, the luminance of the upper substrate reflected image 132 varies gradually around the step-like portion E, as shown by a curve 144, due to the aberration of the optical systems and the like. Here, the half width Sx of the region in which the luminance gradually varies is referred to as a blurring amount.

    [0026] The blurring amount Sx caused by the diffraction on the image capturing surface is on the order of βλ/NA, where β denotes the imaging magnification ratio of the optical system, λ denotes the wavelength of the incident light and NA denotes the numerical aperture of the lens. To accurately identify the step-like portion E, three or more measurements are preferably included within the range of the blurring amount. For example, when the image capturing section 105 is formed by using a CCD, three or more pixels are included within the range of Sx under the condition of (βλ/NA) > 3u, where u denotes the size of the pixel of the CCD. In other words, the condition is transformed into NA < (βλ/3u).

    [0027] For example, when β = 1, u = 5 µm, and λ = 0.67 µm, NA < 0.045. This conditional expression for NA represents the preferable upper limit for NA when the variables β, u and λ take the above-mentioned values. When a tilted lens optics is used, the variable β is replaced with the lateral magnification β' of the tilted lens optics.
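    For reference, the arithmetic behind this upper bound, using the condition NA < βλ/(3u) from the preceding paragraph, is as follows.

```latex
\mathrm{NA} < \frac{\beta\lambda}{3u}
            = \frac{1 \times 0.67\,\mu\mathrm{m}}{3 \times 5\,\mu\mathrm{m}}
            \approx 0.045
```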

    [0028] Fig. 5 is an explanatory view illustrating other conditions. In addition to the blurring amount Sx shown in Fig. 4, there is a blurring amount Sy in the vertical direction of the upper substrate reflected image 132 and the lower substrate reflected image 134. In order to identify the step-like portion E, the height H of the step-like portion E is preferably larger than (Sy + mu), where m denotes the number of pixels used to identify the step-like portion E and thus is an integer of 1 or more. Considering that the blurring amount Sy is also on the order of βλ/NA, the following expression is preferably satisfied.
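    Assembling the quantities named in this paragraph (the step height H exceeding the vertical blurring amount Sy, taken as roughly βλ/NA, plus m pixels of size u), the condition presumably takes a form such as the following reconstruction:

```latex
H > \frac{\beta\lambda}{\mathrm{NA}} + m\,u
```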



    [0029] The height H of the step-like portion E corresponds to the interval h between the upper surface of the lower substrate 102 and the upper surface of the upper substrate 104, in other words, to the thickness of the upper substrate 104. The value of the height H is defined by the following expression.



    [0030] Here, h denotes the distance between the upper surface of the lower substrate 102 and the upper surface of the upper substrate 104, and θi denotes the incident angle of the incident light. When a tilted lens optics is used, the variable β is replaced with the lateral magnification β' of the tilted lens optics.

    [0031] When θi is 90°, sinθi takes a maximum value of 1. Thus, the maximum value of H can be represented by the following expression.



    [0032] By substituting Expression (3) into Expression (1), the following expression is obtained.



    [0033] For example, when β = 1, u = 5 µm, λ = 0.67 µm and m = 1, the condition of NA > 0.0012 should be satisfied in order to detect a distance h of 30 µm. This conditional expression for NA represents the preferable lower limit for NA when the variables β, u, λ and m take the above-mentioned values.

    [0034] In order to more accurately detect the shape of the upper substrate 104, the incident plane including the incident light and the reflected light is preferably in contact with the edge of the upper substrate 104. If the incident plane deviates from the tangential direction of the substrate edge, the detected results may contain errors. In order to reduce such errors to fall within the acceptable range, the angle formed between the incident plane and the tangential direction of the upper substrate 104 is preferably adjusted to be 5° or less.

    [0035] Fig. 6 illustrates another embodiment of how to identify the position of the edge of the upper substrate 104. According to this embodiment, the position of the upper substrate 104 is identified by applying slit images 114, 172 and 174 to three different portions on the edge of the upper substrate 104 and obtaining images formed by the reflections from the respective portions. In this case, detection apparatuses 100 corresponding to the slit images 114, 172 and 174 are provided. Each of the detection apparatuses 100 can identify the position of the edge of a corresponding one of the portions in accordance with the above-described detection method.

    [0036] In this embodiment, provided that the shape of the upper substrate 104 is known in advance, the position of the upper substrate 104 on the stage 101 can be more accurately detected by identifying the positions of three different portions of the edge of the upper substrate 104 in the image 106. For example, if the upper substrate 104 is shaped like a disk, the position of the center of the upper substrate 104 and the radius of the upper substrate 104 can be identified by identifying the positions of three different portions of the edge of the upper substrate 104. Thus, the position of the upper substrate 104 can be accurately detected. This embodiment can not only achieve highly efficient detection but also reduce the errors that may occur when a plurality of portions of a substrate are detected by moving the substrate.
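    A minimal sketch of recovering the center and radius from the three identified edge positions is given below, assuming each position has already been converted into an (x, y) coordinate on the stage 101. Fitting a circumcircle through exactly three points, and the function name, are illustrative choices rather than details prescribed by the specification.

```python
def circle_from_three_points(p1, p2, p3):
    """Return (cx, cy, r) of the circle passing through three edge points,
    each given as an (x, y) coordinate on the stage 101."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    d = 2.0 * (x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2))
    if abs(d) < 1e-12:
        raise ValueError("edge points are collinear; no unique circle")
    # Standard circumcenter formulas.
    cx = ((x1**2 + y1**2) * (y2 - y3) + (x2**2 + y2**2) * (y3 - y1)
          + (x3**2 + y3**2) * (y1 - y2)) / d
    cy = ((x1**2 + y1**2) * (x3 - x2) + (x2**2 + y2**2) * (x1 - x3)
          + (x3**2 + y3**2) * (x2 - x1)) / d
    r = ((x1 - cx)**2 + (y1 - cy)**2) ** 0.5
    return cx, cy, r
```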

    [0037] Figs. 7 and 8 illustrate further different embodiments of identifying the position of the edge of the upper substrate 104. Figs. 7 and 8 illustrate operations performed after the operation shown in Fig. 6. According to this embodiment, the upper substrate 104 and the like are moved relative to the regions to be imaged and the illumination to be applied to the upper substrate 104 and the like, in order to obtain images of a plurality of portions and identify a characteristic such as a notch.

    [0038] In this case, as shown in Fig. 7, the slit image 114 illuminates the region that includes the notch of the upper substrate 104, the slit image 172 illuminates the position 90 degrees rotated away from the notch, and the slit image 174 illuminates the position 180 degrees rotated away from the notch. The slit images 114, 172 and 174 provide elongated illumination extending in the radial direction of the upper substrate 104 at their respective positions. Each of the detection apparatuses 100 obtains an image of a corresponding one of the regions and identifies the position of a corresponding portion of the edge. As shown in Fig. 7, identifying the notch of the upper substrate 104 makes it possible to determine how much the upper substrate 104 has rotated.

    [0039] In Figs. 6 to 8, the slit images 114 and 174 longitudinally extend along the Y axis and the slit image 172 longitudinally extends along the X axis. In this case, the incident plane of the slit image 114 or 174 is perpendicular to the Y axis. The incident plane of the slit image 172 is perpendicular to the X axis.

    [0040] In Figs. 7 and 8, the stage 101 is moved in the X direction (to the right in Figs. 7 and 8) from the position of the upper substrate 104 and the like shown in Fig. 6, as a result of which the upper substrate 104 and the lower substrate 102 are moved and a plurality of different portions of the edge are thus detected. In this case, the image obtaining section 110 captures a plurality of images 106 while the illuminating section 108 and the image obtaining section 110 remain stationary and the stage 101 is moved at a constant rate. Alternatively, the stage 101 may be moved intermittently and the images 106 may be obtained while the stage 101 is temporarily stationary.

    [0041] Fig. 9 illustrates, as an example, the information about the plurality of different portions of the edge that is obtained by the embodiment shown in Figs. 6 to 8. Fig. 9 shows the positions (Y1, Y2, ...) of the step-like portion in the image 106 corresponding to the slit image 114 in Figs. 6 to 8 and the associated positions (X1, X2, ...) on the X axis of the stage 101. Based on this information, the position of the notch on the stage 101 can be identified in the X and Y axes.
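    One way to pick the notch out of data such as that of Fig. 9 is sketched below: the measured step positions Y1, Y2, ... are compared against a smooth baseline of the edge profile, and the sample with the largest deviation is taken as the notch. The quadratic baseline and the function name are assumptions made for illustration.

```python
import numpy as np

def find_notch(x_stage, y_step):
    """Locate the notch as the largest deviation of the measured edge
    profile (step position Y versus stage position X) from a smooth
    quadratic baseline fitted to all of the samples."""
    x = np.asarray(x_stage, dtype=float)
    y = np.asarray(y_step, dtype=float)
    baseline = np.polyval(np.polyfit(x, y, 2), x)  # edge without the notch
    i = int(np.argmax(np.abs(y - baseline)))       # largest deviation
    return x[i], y[i]
```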

    [0042] The method in which a plurality of portions of the edge are identified by moving the upper substrate 104 and the like is not limited to the case shown in Figs. 6 to 8 where three slit images are applied to the layered substrate, in other words, the case where the three detection apparatuses 100 are used to identify three different points on the layered substrate. This method is applicable to a case where a single point is identified as shown in Fig. 1, and to a case where more or fewer than three points are identified. In the case where a single point is identified, the position and shape of the upper substrate 104 can be detected by moving the upper substrate 104 and the like to identify a plurality of portions of the edge.

    [0043] When the stage 101 is moved, vibration and the like may occur and change the relative positions of the stage 101 and the image capturing section 105 during the movement. This may lead to erroneously identified positions. When a plurality of portions are identified as shown in Figs. 6 to 8, on the other hand, the detection apparatuses 100 associated with the slit images 114 and 174 detect the variation in position resulting from the vibration in the Y axis. The detected variation in position resulting from the vibration in the Y axis is used to correct the value in the Y axis included in the position information identified by using the slit image 172. Similarly, the detection apparatus 100 associated with the slit image 172 detects the variation in position resulting from the vibration in the X axis. The detected variation in position resulting from the vibration in the X axis is used to correct the value in the X axis included in the position information identified by using the slit images 114 and 174. Such correction enables the shape and position of the upper substrate 104 to be more accurately detected.
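    The cross-correction described in this paragraph can be written compactly. In the sketch below, the Y-axis vibration seen by the detectors of the slit images 114 and 174 is subtracted from the Y coordinates measured with the slit image 172, and the X-axis vibration seen by the detector of the slit image 172 is subtracted from the X coordinates measured with the slit images 114 and 174; the array-based formulation and the names are assumptions for illustration.

```python
import numpy as np

def correct_vibration(y_from_172, y_vib_114_174, x_from_114_174, x_vib_172):
    """Apply the cross-axis vibration correction (sketch).

    y_from_172     -- Y coordinates identified using the slit image 172
    y_vib_114_174  -- Y-axis vibration detected, at the same instants, by
                      the apparatuses of the slit images 114 and 174
    x_from_114_174 -- X coordinates identified using the slit images 114/174
    x_vib_172      -- X-axis vibration detected by the apparatus of 172
    """
    y_corrected = np.asarray(y_from_172, dtype=float) - np.asarray(y_vib_114_174, dtype=float)
    x_corrected = np.asarray(x_from_114_174, dtype=float) - np.asarray(x_vib_172, dtype=float)
    return x_corrected, y_corrected
```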

    [0044] Figs. 10 and 11 are front views to illustrate an embodiment of scanning the illumination. According to the embodiment shown in Figs. 10 and 11, a plurality of portions of the edge are detected by moving the illumination to illuminate different regions, instead of moving the stage 101.

    [0045] As shown in Fig. 10, the illuminating section 108 has a parallel plate glass 182 on the image side of the lens 115, in other words, between the lens 115 and the layered substrate. Since the entrance surface of the parallel plate glass 182 is oriented perpendicular to the principal ray in Fig. 10, the illumination is applied to the position x1 after passing through the parallel plate glass 182. Here, the position x1 is on the line extended from the center of the lens in the direction of the principal ray.

    [0046] As shown in Fig. 11, if the parallel plate glass 182 is tilted at an angle with respect to the principal ray through the lens, the position to which the illumination is applied can be moved from x1 to x2 without changing the angle at which the illumination is incident on the layered substrate. In this way, a plurality of portions of the edge can be detected by changing the angle of the parallel plate glass 182 to scan the layered substrate with the illumination.
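    The specification does not quantify the shift produced by tilting the parallel plate glass 182; as a point of reference, the standard lateral displacement of a beam transmitted through a plane-parallel plate of thickness t and refractive index n tilted by an angle θ relative to the principal ray is

```latex
d = t \sin\theta \left( 1 - \frac{\cos\theta}{\sqrt{n^{2} - \sin^{2}\theta}} \right)
```

    so the displacement from x1 to x2 can be set by the choice of the tilt angle θ.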

    [0047] Fig. 12 is a front view illustrating another embodiment of scanning the illumination. In Fig. 12, the illuminating section 108 has a mirror 184 at the position of the pupil. The slit image 114 can be applied to different positions by changing the angle of the mirror 184.

    [0048] According to the embodiments shown in Figs. 10 to 12, it is not necessary to move the stage 101 having the layered substrate placed thereon. As a result, the detection apparatus 100 as a whole can be made more compact.

    [0049] In order to more accurately detect the shape of the upper substrate 104, the incident plane including the incident light and the reflected light is preferably in contact with the edge of the upper substrate 104. If the stage 101 or the incident light is moved over a large range, a large angle may be formed between the edge and the incident plane within the detectable region, in which case the detected result may be less accurate (see Figs. 6 to 8). Therefore, if the measurement is performed while the stage 101 or the incident light is moved, it is preferable to limit the movable range of the stage 101 or the incident light. For example, when the slit 116 having a width of 0.065 mm is used to detect the edge of the upper substrate 104 having a size of approximately 300 mm, the upper substrate 104 is pre-aligned such that the notch is oriented in the Y direction with respect to the center of the substrate and is then placed on the stage 101. The edge of the upper substrate 104 may then be accurately detected while the stage 101 or the incident light is moved within 5 mm or less.
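    As a rough check of the 5 mm figure against the 5° limit of paragraph [0034], assume the incident plane stays fixed while the illuminated edge point moves 5 mm along a circle of radius 150 mm (half of the approximately 300 mm substrate size); the angle between the incident plane and the local tangent then grows to roughly

```latex
\Delta\varphi \approx \frac{5\ \mathrm{mm}}{150\ \mathrm{mm}} \approx 0.033\ \mathrm{rad} \approx 1.9^{\circ}
```

    which remains well within the 5° tolerance.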

    [0050] Fig. 13 illustrates an embodiment of identifying four different portions. In addition to the slit images 114, 172 and 174 shown in Fig. 6, a slit image 188 is additionally provided that is applied to the position 180 degrees rotated away from the slit image 172. In this way, four different portions of the edge can be identified simultaneously. In this case, if one of the four slit images is applied to the position of the notch of the upper substrate 104, the other three slit images can be used to simultaneously detect the position of the center of the upper substrate 104.

    [0051] Fig. 14 is an explanatory view showing an image of a portion of the edges of substrates that can be obtained by means of the detection apparatus 100 shown in Fig. 1 when three substrates with different sizes are stacked on each other. For example, when a substrate larger than the lower substrate 102 is placed under the lower substrate 102 in Fig. 1, the three substrates produce, in the image 106, the upper substrate reflected image 132, the lower substrate reflected image 134 and a three-layered substrate reflected image 192, in this order from the top. In this case, the above-described method can be used to detect the position of the edge of the uppermost upper substrate 104 as long as a sufficiently distinguishable step-like portion E is present in association with the edge of the upper substrate 104 in the upper substrate reflected image 132.

    [0052] As is apparent from the above, the present embodiment enables an apparatus configured to manufacture a layered semiconductor apparatus by bonding a plurality of substrates together to accurately detect the outlines and positions of the substrates to be bonded together. In this way, the substrates to be bonded together can be accurately aligned with each other.

    [0053] In the above-described embodiment, the image obtaining section 110 is positioned to obtain an image formed by the specular reflection of the illumination applied at an angle by the illuminating section 108. However, the arrangement of the illuminating section 108 and the image obtaining section 110 is not limited to such. As an alternative example, the illuminating section 108 may be at an angle with respect to the plane of the substrate, and the image obtaining section 110 may obtain an image in the normal direction of the plane orientation of the substrate. As a further alternative example, the illuminating section 108 may apply the illumination in the normal direction to the plane of the substrate, and the image obtaining section 110 may obtain an image at an angle with respect to the plane orientation of the substrate. As a yet further alternative example, the illuminating section 108 and the image obtaining section 110 may both be positioned at an angle with respect to the plane of the substrate and off the specular reflection direction.

    [0054] In the above-described embodiment, the slit image 114 is used as the illumination. However, the illumination is not limited to this example. As an alternative, the negative-positive relation between the slit image 114 and its surroundings may be reversed. In other words, the illumination may be designed to have a slit-like shade surrounded by light. In this case, the illumination is preferably patterned to extend in the radial direction of the substrate when the substrate is circular.

    [0055] While the embodiments of the present invention have been described, the technical scope of the invention is not limited to the above described embodiments. It is apparent to persons skilled in the art that various alterations and improvements can be added to the above-described embodiments. It is also apparent from the scope of the claims that the embodiments added with such alterations or improvements can be included in the technical scope of the invention which is defined by the appended claims.

    [0056] The operations, procedures, steps, and stages of each process performed by an apparatus, system, program, and method shown in the claims, embodiments, or diagrams can be performed in any order as long as the order is not indicated by "prior to," "before," or the like and as long as the output from a previous process is not used in a later process. Even if the process flow is described using phrases such as "first" or "next" in the claims, embodiments, or diagrams, it does not necessarily mean that the process must be performed in this order.


    Claims

    1. A detection method of detecting a position of an uppermost substrate of a plurality of substrates stacked on each other, the detection method comprising:

    applying illumination to a region covering a portion of an edge of the uppermost substrate (104) and a portion of a lower substrate (102) stacked with the uppermost substrate (104);

    obtaining at least one of a reflection image formed by reflection from the uppermost substrate (104) and a reflection image formed by reflection from the lower substrate (102),

    identifying a position of the edge of the uppermost substrate (104) using the obtained image based on a position of a step-like portion (E) present in the region due to a step formed between the uppermost substrate (104) and the lower substrate (102); and

    identifying a position of the uppermost substrate (104) based on the position of the edge of the uppermost substrate (104).


     
    2. The detection method as set forth in Claim 1, wherein
    in the identifying the position of the edge of the uppermost substrate (104), the step-like portion (E) is identified based on the reflection image formed by the reflection from the uppermost substrate (104) or the reflection image formed by the reflection from the lower substrate (102).
     
    3. The detection method as set forth in Claim 2, wherein
    at least one of (i) a direction in which the illumination is applied to the region in the applying and (ii) a direction in which the region is imaged in the obtaining is at an angle with respect to a plane orientation of the uppermost substrate (104), and
    in the identifying the position of the edge of the uppermost substrate (104), the step-like portion is detected by detecting a difference between an illuminated portion of the uppermost substrate (104) by the applied illumination and an illuminated portion of the lower substrate (102) by the applied illumination.
     
    4. The detection method as set forth in one of Claims 2 and 3, wherein
    in the identifying the position of the edge of the uppermost substrate, the step-like portion (E) is detected based on a difference in luminance between the reflection image formed by the reflection from the uppermost substrate (104) and the reflection image formed by the reflection from the lower substrate (102).
     
    5. The detection method as set forth in Claim 2, wherein
    the obtaining includes obtaining an image formed by specular reflection from the region to which the illumination is applied in the applying.
     
    6. The detection method as set forth in one of Claims 1 to 3, wherein
    in the applying, a plane formed by incident light and reflected light of the illumination intersects a radial direction of the uppermost substrate, when the uppermost substrate (104) is circular.
     
    7. The detection method as set forth in one of Claims 1 to 6, wherein
    in the applying, the illumination is patterned to extend in a radial direction of the uppermost substrate (104), when the uppermost substrate (104) is circular.
     
    8. The detection method as set forth in one of Claims 2 to 6, further comprising
    moving the plurality of substrates in a direction intersecting a radial direction of the uppermost substrate (104), when the uppermost substrate (104) is circular, wherein
    in the obtaining, images of the region are obtained, at a plurality of positions of the uppermost substrate that are created by the movement of the substrates, by imaging the region at an angle with respect to a plane orientation of the uppermost substrate (104), and
    in the identifying the position of the uppermost substrate (104), a position of a characteristic included in the edge is detected based on the position of the step-like portion (E) in the images obtained at the plurality of positions of the uppermost substrate (104).
     
    9. The detection method as set forth in Claim 7, wherein
    in the applying, the illumination is scanned in a direction intersecting the radial direction,
    in the obtaining, images of the region are obtained, at a plurality of positions of the illumination created by the scanning of the illumination, by imaging the region at an angle with respect to the plane orientation of the uppermost substrate (104), and
    in the identifying the position of the uppermost substrate, a position of a characteristic included in the edge is identified based on the position of the step-like portion (E) in the images obtained at the plurality of positions of the illumination.
     
    10. The detection method as set forth in Claim 2, wherein
    when the edge of the uppermost substrate (104) includes a notch, the obtaining includes obtaining images by imaging a region of the uppermost substrate (104) that includes the notch and another region of the uppermost substrate (104), and
    the identifying the position of the uppermost substrate (104) includes identifying the position of the uppermost substrate (104) based on the image obtained by imaging the other region and identifying a rotating direction of the notch based on the image obtained by imaging the region including the notch.
     
    11. The detection method as set forth in Claim 10, wherein
    the obtaining includes obtaining images by imaging, as the other region, a plurality of different regions.
     
    12. The detection method as set forth in one of Claims 10 and 11, wherein
    in the applying, when the uppermost substrate (104) is substantially disk-like, illumination that is patterned to extend in a radial direction of the uppermost substrate (104) is applied to the region including the notch and the other region at an angle with respect to the plane orientation of the uppermost substrate (104).
     
    13. The detection method as set forth in Claim 12, further comprising
    moving the plurality of substrates in a direction intersecting the radial direction, wherein
    in the obtaining, images of the region including the notch are at least obtained, at a plurality of positions of the uppermost substrate (104) created by the movement of the substrates, by imaging the region including the notch at an angle with respect to the plane orientation of the uppermost substrate (104), and
    in the identifying the position of the edge, a position of a characteristic included in the notch is detected based on the position of the step-like portion (E) of the images obtained at the plurality of positions of the uppermost substrate (104).
     
    14. The detection method as set forth in Claim 13, wherein
    in the obtaining, images of the other region are obtained, at a plurality of positions created by the movement of the substrates, by imaging the other region at an angle with respect to the plane orientation of the uppermost substrate, and
    the detection method further comprises
    correcting the position of the characteristic included in the notch, based on variation in the position of the edge among the images of the other region that are obtained at the plurality of positions.
     
    15. The detection method as set forth in Claim 12, wherein
    in the applying, the illumination is scanned in a direction intersecting the radial direction,
    in the obtaining, images of the region including the notch are obtained, at a plurality of positions of the illumination, by imaging the region including the notch at an angle with respect to the plane orientation of the uppermost substrate (104), and
    in the identifying the position of the uppermost substrate (104), a position of a characteristic included in the notch is identified based on the position of the step-like portion (E) in the images obtained at the plurality of positions of the illumination.
     
    16. The detection method as set forth in one of Claims 10 to 15, wherein
    in the obtaining, the other region includes a position 90 degrees rotated away from the notch when the uppermost substrate (104) is circular.
     
    17. The detection method as set forth in one of Claims 10 to 15, wherein
    in the obtaining, the other region includes a position 180 degrees rotated away from the notch when the uppermost substrate (104) is circular.
     
    18. The detection method as set forth in one of Claims 2 to 17, wherein
    in the obtaining, a tilted lens optics (115) is used.
     
    19. A detection apparatus configured for detecting a position of an uppermost substrate (104) of a plurality of substrates stacked on each other, the detection apparatus comprising:

    an illuminating section (108) configured for applying illumination to a region covering a portion of an edge of the uppermost substrate (104) and a portion of a lower substrate (102) stacked with the uppermost substrate (104);

    an image obtaining section (110) configured for obtaining at least one of a reflection image formed by reflection from the uppermost substrate (104) and a reflection image formed by reflection from the lower substrate (102), and

    a position identifying section (120) configured for identifying a position of the edge of the uppermost substrate (104) using the image obtained by the image obtaining section based on a position of a step-like portion (E) present in the region due to a step formed between the uppermost substrate (104) and the lower substrate (102).


     
    20. The detection apparatus as set forth in Claim 19, wherein
    the position identifying section (120) identifies the step-like portion (E) based on the reflection image formed by the reflection from the uppermost substrate (104) or the reflection image formed by the reflection from the lower substrate (102).
     
    21. The detection apparatus as set forth in Claim 20, wherein
    at least one of (i) a direction in which the illuminating section applies the illumination to the region and (ii) a direction in which the image obtaining section images the region is at an angle with respect to the plane orientation of the uppermost substrate (104), and
    the position identifying section detects the step-like portion (E) by detecting a difference between an illuminated portion of the uppermost substrate (104) by the applied illumination and an illuminated portion of the lower substrate (102) by the applied illumination.
     
    22. The detection apparatus as set forth in one of Claims 20 and 21, wherein
    the position identifying section detects the step-like portion (E) based on a difference in luminance between the reflection image formed by the reflection from the uppermost substrate (104) and the reflection image formed by the reflection from the lower substrate (102).
     
    23. The detection apparatus as set forth in Claim 19, wherein
    the illuminating section (108) applies the illumination in such a manner that a plane formed by incident light and reflected light of the illumination intersects a radial direction of the uppermost substrate (104), when the uppermost substrate (104) is circular.
     
    24. The detection apparatus as set forth in Claim 20, wherein
    a plurality of the illuminating sections (108) are provided to apply illumination to a plurality of regions of the uppermost substrate (104), and a plurality of the image obtaining sections (110) are provided in correspondence with the plurality of illuminating sections (108).
     


    Ansprüche

    1. Erfassungsverfahren zum Erfassen einer Position eines obersten Substrats einer Vielzahl von aufeinander gestapelten Substraten, wobei das Erfassungsverfahren aufweist:

    Aufbringen von Beleuchtung auf einen Bereich, der einen Abschnitt eines Randes des obersten Substrats (104) und einen Abschnitt eines unteren Substrats (102), das mit dem obersten Substrat (104) gestapelt ist, abdeckt;

    Erhalten von einem Reflexionsbild, das durch Reflexion von dem obersten Substrat (104) gebildet wird, und/oder einem Reflexionsbild, das durch Reflexion von dem unteren Substrat (102) gebildet wird,

    Identifizieren einer Position des Randes des obersten Substrats (104) unter Verwendung des erhaltenen Bildes auf der Grundlage einer Position eines stufenartigen Abschnitts (E), der in dem Bereich aufgrund einer zwischen dem obersten Substrat (104) und dem unteren Substrat (102) gebildeten Stufe vorhanden ist; und

    Identifizieren einer Position des obersten Substrats (104) auf der Grundlage der Position des Randes des obersten Substrats (104).


     
    2. Erfassungsverfahren nach Anspruch 1, wobei
    bei dem Identifizieren der Position des Randes des obersten Substrats (104) der stufenartige Abschnitt (E) auf der Grundlage des durch die Reflexion von dem obersten Substrat (104) gebildeten Reflexionsbildes oder des durch die Reflexion von dem unteren Substrat (102) gebildeten Reflexionsbildes identifiziert wird.
     
    3. Erfassungsverfahren nach Anspruch 2, wobei
    mindestens eine von (i) einer Richtung, in der die Beleuchtung auf den Bereich bei dem Aufbringen aufgebracht wird, und (ii) einer Richtung, in der der Bereich bei dem Erhalten abgebildet wird, unter einem Winkel in Bezug auf eine Ebenenausrichtung des obersten Substrats (104) steht, und
    bei dem Identifizieren der Position des Randes des obersten Substrats (104) der stufenartige Abschnitt durch Erfassen eines Unterschieds zwischen einem beleuchteten Abschnitt des obersten Substrats (104) durch die aufgebrachte Beleuchtung und einem beleuchteten Abschnitt des unteren Substrats (102) durch die aufgebrachte Beleuchtung erfasst.
     
    4. Erfassungsverfahren nach einem der Ansprüche 2 und 3, wobei
    bei dem Identifizieren der Position des Randes des obersten Substrats der stufenartige Abschnitt (E) auf der Grundlage eines Unterschieds der Leuchtdichte zwischen dem durch die Reflexion von dem obersten Substrat (104) gebildeten Reflexionsbild und dem durch die Reflexion von dem unteren Substrat (102) gebildeten Reflexionsbild erfasst.
     
    5. Erfassungsverfahren nach Anspruch 2, wobei
    das Erhalten das Erhalten eines Bildes aufweist, das durch gerichtete Reflexion von dem Bereich, auf den die Beleuchtung bei dem Aufbringen aufgebracht wird, gebildet wird.
     
    6. Erfassungsverfahren nach einem der Ansprüche 1 bis 3, wobei
    bei dem Aufbringen eine Ebene, die durch einfallendes Licht und reflektiertes Licht der Beleuchtung gebildet wird, eine radiale Richtung des obersten Substrats schneidet, wenn das oberste Substrat (104) kreisförmig ist.
     
    7. Erfassungsverfahren nach einem der Ansprüche 1 bis 6, wobei
    bei dem Aufbringen die Beleuchtung so strukturiert ist, dass sie sich in radialer Richtung des obersten Substrats (104) erstreckt, wenn das oberste Substrat (104) kreisförmig ist.
     
    8. Erfassungsverfahren nach einem der Ansprüche 2 bis 6, ferner aufweisend:

    Bewegen der Vielzahl von Substraten in einer Richtung, die eine radiale Richtung des obersten Substrats (104) schneidet, wenn das oberste Substrat (104) kreisförmig ist, wobei

    bei dem Erhalten Bilder des Bereichs an einer Vielzahl von Positionen des obersten Substrats erhalten werden, die durch die Bewegung der Substrate erzeugt werden, indem der Bereich unter einem Winkel in Bezug auf eine Ebenenausrichtung des obersten Substrats abgebildet wird (104), und

    bei dem Identifizieren der Position des obersten Substrats (104) eine Position einer in dem Rand enthaltenen Charakteristik auf der Grundlage der Position des stufenartigen Abschnitts (E) in den Bildern erfasst wird, die an der Vielzahl von Positionen des obersten Substrats (104) erhalten werden.


     
    9. Erfassungsverfahren nach Anspruch 7, wobei
    bei dem Aufbringen die Beleuchtung in einer Richtung abgetastet wird, die die radiale Richtung schneidet,
    bei dem Erhalten Bilder des Bereichs an einer Vielzahl von Positionen von der Beleuchtung erhalten werden, die durch das Abtasten der Beleuchtung erzeugt wird, indem der Bereich unter einem Winkel in Bezug auf die Ebenenausrichtung des obersten Substrats (104) abgebildet wird, und
    bei dem Identifizieren der Position des obersten Substrats eine Position einer in dem Rand enthaltenen Charakteristik auf der Grundlage der Position des stufenartigen Abschnitts (E) in den Bildern identifiziert wird, die an der Vielzahl von Positionen des obersten Substrats (104) erhalten werden.
     
    10. Erfassungsverfahren nach Anspruch 2, wobei
    wenn der Rand des obersten Substrats (104) eine Kerbe enthält, das Erhalten das Erhalten von Bildern durch Abbilden eines Bereichs des obersten Substrats (104), der die Kerbe und einen anderen Bereich des obersten Substrats (104) enthält, aufweist, und
    das Identifizieren der Position des obersten Substrats (104) das Identifizieren der Position des obersten Substrats (104) auf der Grundlage des Bildes, das durch Abbilden des anderen Bereichs erhalten wird, und das Identifizieren einer Drehrichtung der Kerbe auf der Grundlage des Bildes, das durch Abbilden des Bereichs aufweisend die Kerbe erhalten wird, umfasst.
     
    11. Erfassungsverfahren nach Anspruch 10, wobei
    das Erhalten das Erhalten von Bildern durch Abbilden, als den anderen Bereich, einer Vielzahl verschiedener Regionen aufweist.
     
    12. Erfassungsverfahren nach einem der Ansprüche 10 und 11, wobei
    bei dem Aufbringen, wenn das oberste Substrat (104) im Wesentlichen scheibenförmig ist, eine Beleuchtung, die so strukturiert ist, dass sie sich in einer radialen Richtung des obersten Substrats (104) erstreckt, auf den Bereich aufweisend die Kerbe und den anderen Bereich unter einem Winkel in Bezug auf die Ebenenausrichtung des obersten Substrats (104) aufgebracht wird.
     
    13. Erfassungsverfahren gemäß Anspruch 12, ferner aufweisend
    Bewegen der Vielzahl von Substraten in einer Richtung, die die radiale Richtung schneidet, wobei
    bei dem Erhalten zumindest Bilder des Bereichs aufweisend die Kerbe an einer Vielzahl von Positionen des obersten Substrats (104) erhalten werden, die durch die Bewegung der Substrate erzeugt werden, indem der Bereich aufweisend die Kerbe unter einem Winkel in Bezug auf die Ebenenausrichtung des obersten Substrats (104) abgebildet wird, und
    bei dem Identifizieren der Position des Randes eine Position einer in der Kerbe enthaltenen Charakteristik auf der Grundlage der Position des stufenartigen Abschnitts (E) der Bilder erfasst wird, die an der Vielzahl von Positionen des obersten Substrats (104) erhalten werden.
     
    14. Erfassungsverfahren gemäß Anspruch 13, wobei
    bei dem Erhalten Bilder des anderen Bereichs an einer Vielzahl von Positionen erhalten werden, die durch die Bewegung der Substrate erzeugt werden, indem der andere Bereich unter einem Winkel in Bezug auf die Ebenenausrichtung des obersten Substrats abgebildet wird, und das Erfassungsverfahren ferner aufweist
    Korrigieren der Position der in der Kerbe enthaltenen Charakteristik auf der Grundlage der Variation der Position des Randes zwischen den Bildern des anderen Bereichs, die an der Vielzahl der Positionen erhalten werden.
     
    15. Erfassungsverfahren gemäß Anspruch 12, wobei
    bei dem Aufbringen die Beleuchtung in einer Richtung abgetastet wird, die die radiale Richtung schneidet,
    bei dem Erhalten Bilder des Bereichs aufweisend die Kerbe an einer Vielzahl von Positionen der Beleuchtung erhalten werden, indem der Bereich aufweisend die Kerbe unter einem Winkel in Bezug auf die Ebenenausrichtung des obersten Substrats (104) abgebildet wird, und
    bei dem Identifizieren der Position des obersten Substrats (104) eine Position einer in dem Rand enthaltenen Charakteristik auf der Grundlage der Position des stufenartigen Abschnitts (E) in den Bildern erfasst wird, die an der Vielzahl von Positionen der Beleuchtung erhalten werden.
     
    16. Erfassungsverfahren nach einem der Ansprüche 10 bis 15, wobei
    bei dem Erhalten, der andere Bereich eine um 90 Grad von der Kerbe weggedrehte Position aufweist, wenn das oberste Substrat (104) kreisförmig ist.
     
    17. Erfassungsverfahren nach einem der Ansprüche 10 bis 15, wobei
    bei dem Erhalten, der andere Bereich eine um 180 Grad von der Kerbe weggedrehte Position aufweist, wenn das oberste Substrat (104) kreisförmig ist.
     
    18. Erfassungsverfahren nach einem der Ansprüche 2 bis 17, wobei
    bei dem Erhalten eine geneigte Linsenoptik (115) verwendet wird.
     
    19. Erfassungsvorrichtung, die zum Erfassen einer Position eines obersten Substrats (104) einer Vielzahl von aufeinander gestapelten Substraten eingerichtet ist, wobei die Erfassungsvorrichtung aufweist:

    einen Beleuchtungsabschnitt (108), der zum Aufbringen von Beleuchtung auf einen Bereich, der einen Abschnitt eines Randes des obersten Substrats (104) und einen Abschnitt eines unteren Substrats (102), das mit dem obersten Substrat (104) gestapelt ist, abdeckt, eingerichtet ist;

    einen Bilderhaltungsabschnitt (110), der zum Erhalten von einem Reflexionsbild, das durch Reflexion von dem obersten Substrat (104) gebildet wird, und/oder einem Reflexionsbild, das durch Reflexion von dem unteren Substrat (102) gebildet wird, eingerichtet ist, und

    einen Positionsidentifizierungsabschnitt (120), der zum Identifizieren einer Position des Randes des obersten Substrats (104) unter Verwendung des durch den Bilderhaltungsabschnitt erhaltenen Bildes auf der Grundlage einer Position eines stufenartigen Abschnitts (E) konfiguriert ist, der in dem Bereich aufgrund einer zwischen dem obersten Substrat (104) und dem unteren Substrat (102) gebildeten Stufe vorhanden ist.


     
    20. Erfassungsvorrichtung gemäß Anspruch 19, wobei
    der Positionsidentifizierungsabschnitt (120) den stufenartigen Abschnitt (E) auf der Grundlage des durch die Reflexion von dem obersten Substrat (104) gebildeten Reflexionsbildes oder des durch die Reflexion von dem unteren Substrat (102) gebildeten Reflexionsbildes identifiziert.
     
    21. The detection apparatus according to Claim 20, wherein
    at least one of (i) a direction in which the illumination section applies the illumination to the region and (ii) a direction in which the image obtaining section images the region is at an angle with respect to a plane orientation of the topmost substrate (104), and
    the position identifying section detects the step-like section (E) by detecting a difference between a portion of the topmost substrate (104) illuminated by the applied illumination and a portion of the lower substrate (102) illuminated by the applied illumination.

    22. The detection apparatus according to any one of Claims 20 and 21, wherein
    the position identifying section detects the step-like section (E) based on a difference in luminance between the reflection image formed by the reflection from the topmost substrate (104) and the reflection image formed by the reflection from the lower substrate (102).

    23. The detection apparatus according to Claim 19, wherein
    the illumination section (108) applies the illumination in such a manner that a plane formed by incident light and reflected light of the illumination intersects a radial direction of the topmost substrate (104), when the topmost substrate (104) is circular.

    24. The detection apparatus according to Claim 20, wherein
    a plurality of the illumination sections (108) are provided to apply illumination to a plurality of regions of the topmost substrate (104), and a plurality of the image obtaining sections (110) are provided in correspondence with the plurality of the illumination sections (108).
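    The luminance-difference detection of the step-like section (E), as recited in claims 4 and 22, lends itself to a short numerical illustration. The Python sketch below is illustrative only and is not taken from the patent: it assumes the obtained reflection image is available as a 2-D grey-level array whose brighter part corresponds to the reflection from the topmost substrate (104) and whose darker part corresponds to the reflection from the lower substrate (102); the function name and the simple gradient criterion are assumptions made for this example.

    # Illustrative sketch, not from the patent: locate the step-like
    # section (E) as the steepest luminance transition in a reflection image.
    import numpy as np

    def find_step_position(image: np.ndarray, axis: int = 1) -> int:
        """Return the pixel index along `axis` where the averaged luminance
        changes most sharply, taken here as the step-like section (E)."""
        # Average the luminance along the direction parallel to the edge,
        # so the step appears as a 1-D brightness transition.
        profile = image.mean(axis=1 - axis)
        # The steepest change separates the brighter reflection of the
        # topmost substrate from the darker reflection of the lower one.
        gradient = np.abs(np.diff(profile))
        return int(np.argmax(gradient))

    # Synthetic example: bright topmost-substrate reflection in columns 0-39,
    # darker lower-substrate reflection from column 40 onwards.
    img = np.full((64, 96), 50.0)
    img[:, :40] = 200.0
    print(find_step_position(img))  # -> 39 (transition column)

    For this synthetic image the steepest change in the averaged luminance profile falls at column 39, which in an obtained reflection image would be taken as the position of the step-like section (E).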
     


    Claims

    1. A detection method of detecting a position of a topmost substrate of a plurality of substrates stacked on each other, the detection method comprising:

    applying illumination to a region covering a portion of an edge of the topmost substrate (104) and a portion of a lower substrate (102) stacked with the topmost substrate (104);

    obtaining at least one of a reflection image formed by reflection from the topmost substrate (104) and a reflection image formed by reflection from the lower substrate (102);

    identifying a position of the edge of the topmost substrate (104), using the obtained image, based on a position of a step-like section (E) present in the region due to a step formed between the topmost substrate (104) and the lower substrate (102); and

    identifying a position of the topmost substrate (104) based on the position of the edge of the topmost substrate (104).

    2. The detection method according to Claim 1, wherein
    in the identifying of the position of the edge of the topmost substrate (104), the step-like section (E) is identified based on the reflection image formed by the reflection from the topmost substrate (104) or the reflection image formed by the reflection from the lower substrate (102).

    3. The detection method according to Claim 2, wherein
    at least one of (i) a direction in which the illumination is applied to the region in the applying and (ii) a direction in which the region is imaged in the obtaining is at an angle with respect to a plane orientation of the topmost substrate (104), and
    in the identifying of the position of the edge of the topmost substrate (104), the step-like section is detected by detecting a difference between a portion of the topmost substrate (104) illuminated by the applied illumination and a portion of the lower substrate (102) illuminated by the applied illumination.
     
    4. The detection method according to any one of Claims 2 and 3, wherein
    in the identifying of the position of the edge of the topmost substrate, the step-like section (E) is detected based on a difference in luminance between the reflection image formed by the reflection from the topmost substrate (104) and the reflection image formed by the reflection from the lower substrate (102).

    5. The detection method according to Claim 2, wherein
    the obtaining includes obtaining an image formed by specular reflection from the region to which the illumination is applied in the applying.

    6. The detection method according to any one of Claims 1 to 3, wherein
    in the applying, a plane formed by incident light and reflected light of the illumination intersects a radial direction of the topmost substrate, when the topmost substrate (104) is circular.

    7. The detection method according to any one of Claims 1 to 6, wherein
    in the applying, the illumination is patterned to extend in a radial direction of the topmost substrate (104), when the topmost substrate (104) is circular.

    8. The detection method according to any one of Claims 2 to 6, further comprising
    moving the plurality of substrates in a direction intersecting a radial direction of the topmost substrate (104), when the topmost substrate (104) is circular, wherein
    in the obtaining, images of the region are obtained, at a plurality of positions of the topmost substrate created by the moving of the substrates, by imaging the region at an angle with respect to a plane orientation of the topmost substrate (104), and
    in the identifying of the position of the topmost substrate (104), a position of a feature included in the edge is detected based on the position of the step-like section (E) in the images obtained at the plurality of positions of the topmost substrate (104).
     
    9. The detection method according to Claim 7, wherein
    in the applying, the illumination is scanned in a direction intersecting the radial direction,
    in the obtaining, images of the region are obtained, at a plurality of positions of the illumination created by the scanning of the illumination, by imaging the region at an angle with respect to the plane orientation of the topmost substrate (104), and
    in the identifying of the position of the topmost substrate, a position of a feature included in the edge is identified based on the position of the step-like section (E) in the images obtained at the plurality of positions of the illumination.

    10. The detection method according to Claim 2, wherein
    when the edge of the topmost substrate (104) includes a notch, the obtaining includes obtaining images by imaging a region of the topmost substrate (104) that includes the notch and another region of the topmost substrate (104), and
    the identifying of the position of the topmost substrate (104) includes identifying the position of the topmost substrate (104) based on the image obtained by imaging the other region and identifying a direction of rotation of the notch based on the image obtained by imaging the region including the notch.

    11. The detection method according to Claim 10, wherein
    the obtaining includes obtaining images by imaging, as the other region, a plurality of different regions.

    12. The detection method according to any one of Claims 10 and 11, wherein
    in the applying, when the topmost substrate (104) is substantially disc-shaped, the illumination patterned to extend in a radial direction of the topmost substrate (104) is applied to the region including the notch and to the other region at an angle with respect to the plane orientation of the topmost substrate (104).
     
    13. The detection method according to Claim 12, further comprising
    moving the plurality of substrates in a direction intersecting the radial direction, wherein
    in the obtaining, images of at least the region including the notch are obtained, at a plurality of positions of the topmost substrate (104) created by the moving of the substrates, by imaging the region including the notch at an angle with respect to the plane orientation of the topmost substrate (104), and
    in the identifying of the position of the edge, a position of a feature included in the notch is detected based on the position of the step-like section (E) in the images obtained at the plurality of positions of the topmost substrate (104).

    14. The detection method according to Claim 13, wherein
    in the obtaining, images of the other region are obtained, at a plurality of positions created by the moving of the substrates, by imaging the other region at an angle with respect to the plane orientation of the topmost substrate, and
    the detection method further comprises
    correcting the position of the feature included in the notch, based on a variation in the position of the edge among the images of the other region obtained at the plurality of positions.

    15. The detection method according to Claim 12, wherein
    in the applying, the illumination is scanned in a direction intersecting the radial direction,
    in the obtaining, images of the region including the notch are obtained, at a plurality of positions of the illumination, by imaging the region including the notch at an angle with respect to the plane orientation of the topmost substrate (104), and
    in the identifying of the position of the topmost substrate (104), a position of a feature included in the notch is identified based on the position of the step-like section (E) in the images obtained at the plurality of positions of the illumination.
     
    16. The detection method according to any one of Claims 10 to 15, wherein
    in the obtaining, the other region includes a position rotated by 90 degrees from the notch, when the topmost substrate (104) is circular.

    17. The detection method according to any one of Claims 10 to 15, wherein
    in the obtaining, the other region includes a position rotated by 180 degrees from the notch, when the topmost substrate (104) is circular.

    18. The detection method according to any one of Claims 2 to 17, wherein
    in the obtaining, a tilted lens optical system (115) is used.

    19. A detection apparatus configured to detect a position of a topmost substrate (104) of a plurality of substrates stacked on each other, the detection apparatus comprising:

    an illumination section (108) configured to apply illumination to a region covering a portion of an edge of the topmost substrate (104) and a portion of a lower substrate (102) stacked with the topmost substrate (104);

    an image obtaining section (110) configured to obtain at least one of a reflection image formed by reflection from the topmost substrate (104) and a reflection image formed by reflection from the lower substrate (102); and

    a position identifying section (120) configured to identify a position of the edge of the topmost substrate (104), using the image obtained by the image obtaining section, based on a position of a step-like section (E) present in the region due to a step formed between the topmost substrate (104) and the lower substrate (102).


     
    20. The detection apparatus according to Claim 19, wherein
    the position identifying section (120) identifies the step-like section (E) based on the reflection image formed by the reflection from the topmost substrate (104) or the reflection image formed by the reflection from the lower substrate (102).

    21. The detection apparatus according to Claim 20, wherein
    at least one of (i) a direction in which the illumination section applies the illumination to the region and (ii) a direction in which the image obtaining section images the region is at an angle with respect to the plane orientation of the topmost substrate (104), and
    the position identifying section detects the step-like section (E) by detecting a difference between a portion of the topmost substrate (104) illuminated by the applied illumination and a portion of the lower substrate (102) illuminated by the applied illumination.

    22. The detection apparatus according to any one of Claims 20 and 21, wherein
    the position identifying section detects the step-like section (E) based on a difference in luminance between the reflection image formed by the reflection from the topmost substrate (104) and the reflection image formed by the reflection from the lower substrate (102).

    23. The detection apparatus according to Claim 19, wherein
    the illumination section (108) applies the illumination in such a manner that a plane formed by incident light and reflected light of the illumination intersects a radial direction of the topmost substrate (104), when the topmost substrate (104) is circular.

    24. The detection apparatus according to Claim 20, wherein
    a plurality of the illumination sections (108) are provided to apply illumination to a plurality of regions of the topmost substrate (104), and a plurality of the image obtaining sections (110) are provided in correspondence with the plurality of the illumination sections (108).
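    Once edge positions have been detected in this way at several regions of the topmost substrate (104), for example at regions rotated 90 degrees apart as in claim 16, the position of the substrate and the rotational direction of the notch can be derived from them. The sketch below is again only an illustration under assumed conditions: the edge coordinates, the wafer radius, the least-squares circle fit and the atan2-based angle are generic choices made for this example and are not steps prescribed by the claims.

    # Illustrative sketch, not from the patent: derive the substrate position
    # (fitted centre) and the rotational direction of the notch from detected
    # edge positions.
    import math
    import numpy as np

    def fit_circle(points: np.ndarray) -> tuple[float, float, float]:
        """Least-squares circle fit; returns (cx, cy, r) for edge points (x, y)."""
        x, y = points[:, 0], points[:, 1]
        # Solve x^2 + y^2 = 2*cx*x + 2*cy*y + c in the least-squares sense,
        # where c = r^2 - cx^2 - cy^2.
        A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
        b = x ** 2 + y ** 2
        (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
        return cx, cy, math.sqrt(c + cx ** 2 + cy ** 2)

    def notch_direction(center: tuple[float, float],
                        notch_point: tuple[float, float]) -> float:
        """Rotational direction of the notch around the fitted centre, in degrees."""
        return math.degrees(math.atan2(notch_point[1] - center[1],
                                       notch_point[0] - center[0]))

    # Assumed edge positions from regions rotated 90 degrees apart, plus one
    # measurement taken in the region that includes the notch.
    edge_points = np.array([[150.0, 0.0], [0.0, 150.0], [-150.0, 0.0], [0.0, -150.0]])
    cx, cy, r = fit_circle(edge_points)
    print(round(cx, 3), round(cy, 3), round(r, 3))   # approximately 0.0 0.0 150.0
    print(notch_direction((cx, cy), (150.0, 0.0)))   # 0.0 degrees for a notch on the +x axis

    Here the fitted centre plays the role of the identified position of the topmost substrate, and the angle returned for the point measured in the notch region expresses the direction of rotation of the notch relative to that centre.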
     




    Drawing


    REFERENCES CITED IN THE DESCRIPTION



    This list of references cited by the applicant is for the reader's convenience only. It does not form part of the European patent document. Even though great care has been taken in compiling the references, errors or omissions cannot be excluded and the EPO disclaims all liability in this regard.

    Patent documents cited in the description