
US20010031068A1 - Target detection system using radar and image processing - Google Patents


Info

Publication number
US20010031068A1
US20010031068A1 (application US09/834,403)
Authority
US
United States
Prior art keywords
image
edge data
area
flag
radar
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US09/834,403
Inventor
Akihiro Ohta
Kenji Oka
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Denso Ten Ltd
Original Assignee
Denso Ten Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2000118549A (patent JP4308405B2)
Priority claimed from JP2000152695A (patent JP4311861B2)
Application filed by Denso Ten Ltd filed Critical Denso Ten Ltd
Assigned to FUJITSU TEN LIMITED (assignment of assignors' interest; see document for details). Assignors: OHTA, AKIHIRO; OKA, KENJI
Publication of US20010031068A1
Priority to US11/188,160 (patent US7376247B2)

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 3/00: Measuring distances in line of sight; Optical rangefinders
    • G01C 3/02: Details
    • G01C 3/06: Use of electric means to obtain final indication
    • G01C 3/08: Use of electric radiation detectors
    • G01C 11/00: Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C 11/04: Interpretation of pictures
    • G01C 11/06: Interpretation of pictures by comparison of two or more pictures of the same area
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00: Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/86: Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S 13/867: Combination of radar systems with cameras
    • G01S 13/88: Radar or analogous systems specially adapted for specific applications
    • G01S 13/93: Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S 13/931: Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S 11/00: Systems for determining distance or velocity not using reflection or reradiation
    • G01S 11/12: Systems for determining distance or velocity not using reflection or reradiation, using electromagnetic waves other than radio waves
    • G01S 7/00: Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00
    • G01S 7/02: Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00 of systems according to group G01S 13/00
    • G01S 7/41: Details of systems according to group G01S 13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/97: Determining parameters from multiple pictures
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00: Arrangements for image or video recognition or understanding
    • G06V 10/20: Image preprocessing
    • G06V 10/25: Determination of region of interest [ROI] or a volume of interest [VOI]

Definitions

  • the present invention relates to a target detection system.
  • the target detection system according to this invention is mounted on an automotive vehicle, for example, and is used to aid the driver in driving the vehicle by detecting a preceding vehicle, an obstacle lying ahead, or a similar target located ahead of the vehicle.
  • a conventional target detection system uses a fusion algorithm in which the reliability of data is evaluated by using both the result of detecting a target by an EHF radar and the result of detecting a target by image processing, thereby achieving the optimal result.
  • a target detection system comprises an EHF radar, a left camera, a right camera and an image processing ECU (electronic control unit).
  • the ECU includes an image processing microcomputer and a fusion processing microcomputer.
  • a specified area ahead is scanned by the extremely high-frequency wave.
  • the strength of the signal power output from the EHF radar and the range are so related to each other that the signal power strength is high for the portion where a target exists and low for the portion where a target does not exist.
  • the EHF radar can measure a far distance with high accuracy but is low in accuracy for the measurement of a near target. Also, the EHF radar outputs a near flag upon detection of a near target.
  • the image processing microcomputer extracts the edge of each of the two images acquired by the two cameras.
  • the edge positions of the two images are different due to the parallax between the left and right cameras, and this difference is used to calculate the distance to the target.
  • Image processing can be used to measure the distance over a wide range but is low in accuracy for detection of a far target.
  • the distance measurement by image processing further has the following problems.
  • since edge extraction processing simply extracts edges from an image, the edges of letters written on the road surface, shadows, or other objects that are not three-dimensional and differ from the target may be extracted erroneously due to their density difference. In such a case, edges are output in spite of the absence of a target.
  • FIG. 1 shows detection areas defined for the target detection system.
  • An area 2 in which a target can be detected by image processing has a large range, while an area 3 where a target can be detected by an EHF radar reaches a far distance.
  • in an area 4 where a target can be detected by using both the image processing and the EHF radar, on the other hand, a target can be recognized very reliably by the fusion processing between the output data of the radar and the output data of the image processing.
  • the area 4 is called the fusion area.
  • the microcomputer for fusion processing determines the presence or absence of a target based on an overall decision on both the result of detection by the EHF radar and the result of detection by the image processing microcomputer, and thus recognizes the presence of a target such as a preceding vehicle and calculates the distance, etc.
  • the processing time for the fusion algorithm in the fusion processing microcomputer is required in addition to the processing time for the EHF radar and the processing time for the image processing microcomputer, and therefore the total processing time is long. Also, the conventional target detection system has yet to sufficiently overcome the disadvantages of the EHF radar and of image processing.
  • An object of the present invention is to provide a target detection system using the EHF radar and image processing, wherein the processing time for detection is shortened while the disadvantages of the EHF radar and of image processing are mutually compensated.
  • Another object of the invention is to provide a target detection system using both the EHF radar and the image processing, wherein the reliability is improved by preventing erroneous recognition and erroneous distance measurement in image recognition processing.
  • a target detection system comprising a radar, an image acquisition unit and a processing unit.
  • the processing unit specifies an area for image recognition based on the data output from the radar, and processes image data output from the image acquisition unit only within the specified area thereby to detect a target.
  • objects other than three-dimensional ones, including lines or letters on the road surface, are not detected by the radar as a target, and therefore lines, letters and other such planar objects are not detected as a target by image processing either.
  • the image data are processed only for an area where a target such as an obstacle or a vehicle is detected by the radar, and therefore the time required for processing the image data is shortened thereby shortening the processing time, as a whole, for target detection.
  • the image recognition area can be specified based on the power of the signal output from the radar.
  • upon detection of a target such as an obstacle or a vehicle, the radar outputs a signal of predetermined power.
  • a target is extracted only from an area having such a target by extracting the edge of the image data only for the particular area from which the signal power is detected.
  • the time required for image data processing can be shortened.
  • all the edges may be extracted from the image data and only the edges existing in the image recognition area may be processed as effective edges for target detection. In such a case, the time required for image processing is not shortened but the time required for fusion processing can be shortened.
  • the image recognition area can be determined based on the state of the near flag output from the radar.
  • upon detection of a near target, the radar outputs a near flag, the state of which changes with the distance to the target.
  • the edge data acquired in the image processing is selected in accordance with the presence or absence and the state of the near flag, and therefore the recognition error and the distance measurement error of the target can be prevented before the fusion processing.
  • a road surface flag and a letter flag can be attached to the edge data extracted by image processing in the case where a density difference on the image due to lines or letters on the road surface is detected.
  • For the edge data with the road surface flag or the letter flag, it is determined whether the edge data actually represents lines or letters written on the road surface. In the case where the edge data are found to represent lines or letters, the data in the particular area are invalidated. As a result, the recognition error in which lines or letters on the road surface are recognized as a target, and the associated measurement error, can be prevented before the fusion processing.
  • FIG. 1 shows different detection areas defined for a target detection system mounted on an automotive vehicle
  • FIG. 2 shows a basic configuration of a target detection system
  • FIG. 3 shows an image picked up as the condition ahead of the target detection system
  • FIG. 4 is a diagram showing the edges calculated by the image processing in the system of FIG. 2;
  • FIG. 5 is a diagram showing typical edges extracted by processing the edges of FIG. 4;
  • FIG. 6 is a diagram showing the data output from the EHF radar of FIG. 2;
  • FIG. 7 shows a vehicle target detection system according to a first embodiment of the invention.
  • FIG. 8 is a flowchart showing the processing steps of the microcomputer of FIG. 7;
  • FIG. 9 shows a first method of determining a search area from the signal power strength obtained by the EHF radar of FIG. 7;
  • FIG. 10 shows edges extracted by the processing shown in FIG. 8;
  • FIG. 11 shows a second method for determining a search area from the strength of the signal power obtained by the EHF radar shown in FIG. 7;
  • FIG. 12 shows edges extracted by the method shown in FIG. 11;
  • FIG. 13 shows a third method for determining a search area from the strength of the signal power obtained by the EHF radar shown in FIG. 7;
  • FIG. 14 shows an image of a plurality of preceding vehicles
  • FIG. 15 is a flowchart showing the second processing of the microcomputer of FIG. 7;
  • FIG. 16 shows the strength of the power obtained by the EHF radar in the processing shown in FIG. 15;
  • FIG. 17 shows a first method for determining a search area in the process of FIG. 15;
  • FIG. 18 shows a second method for determining a search area in the process of FIG. 15
  • FIG. 19 is a flowchart showing the third processing of the microcomputer of FIG. 7;
  • FIG. 20 shows a vehicle target detection apparatus according to a second embodiment of the invention.
  • FIGS. 21A to 21E are diagrams for explaining the processing in the image recognition unit of FIG. 20;
  • FIG. 22 shows a first specific circuit configuration of a target detection system according to a second embodiment
  • FIG. 23 is a flowchart showing the operation of the system of FIG. 22;
  • FIG. 24 shows a second specific circuit configuration of a target detection system according to the second embodiment
  • FIG. 25 is a flowchart showing the operation of the system of FIG. 24;
  • FIG. 26 shows a third specific circuit configuration of a target detection system according to the second embodiment
  • FIG. 27 shows a pattern matching area in FIG. 26
  • FIG. 28 is a flowchart showing the operation of the system of FIG. 26;
  • FIG. 29 shows a fourth specific circuit configuration of a vehicle target detection system according to the second embodiment
  • FIG. 30 is a flowchart showing the first operation of the system of FIG. 29.
  • FIG. 31 is a flowchart showing the second operation of the system of FIG. 29.
  • the target detection system comprises an EHF radar 11, a left camera 12, a right camera 13 and an image processing ECU 14.
  • the ECU 14 is configured with an image processing microcomputer 21 and a fusion processing microcomputer 22.
  • the image processing microcomputer 21 detects a target by processing the image data obtained from the cameras 12, 13. Specifically, edges are extracted from the images obtained from the left and right cameras 12, 13, and the parallax is calculated from the left and right positions of edge extraction thereby to calculate the distance value.
  • the processing operation of the image processing microcomputer 21 will be explained with reference to FIG. 3.
  • FIG. 3 shows a picked-up image of the condition ahead of the target detection system. Ahead of the vehicle, there is a preceding vehicle 6, lines 7 are drawn on the road surface, and a guard rail 8 is present at the shoulder.
  • FIG. 4 shows the result of calculating the edges by processing the image of FIG. 3, and FIG. 5 the result of extracting the edges in the descending order of peak strength by processing the result of FIG. 4. In FIG. 5, the vertical short lines represent the edges extracted.
  • the image processing microcomputer 21 extracts the edges of FIG. 5 from the images of the left and right cameras 12, 13 and calculates the distance to the target using the parallax.
  • FIG. 6 shows the relation between the horizontal position (angle or range) of the scanned area and the power strength of the data output from the EHF radar 11. It is seen that the power strength of the portion where the target is present is high, and vice versa.
  • the fusion processing microcomputer 22 determines whether a target is present or not by overall analysis of the detection result of the EHF radar 11 and the detection result of the image processing microcomputer 21, and thereby checks the presence of a target such as a preceding vehicle and calculates the distance to the preceding vehicle.
  • FIG. 7 shows a vehicle target detection system according to a first embodiment of the invention.
  • a vehicle target detection system comprises an EHF radar 11, a left camera 12, a right camera 13 and an image processing ECU 14.
  • the ECU 14 is configured with a microcomputer 15 having the dual functions of image processing and fusion processing. Although the two cameras 12, 13, left and right, are used for measuring the distance by parallax in image processing, only one camera will do in the case where the distance is not measured by parallax.
  • FIG. 8 is a flowchart showing the processing in the microcomputer 15.
  • the condition ahead of the vehicle is assumed to be the same as shown in FIG. 3 and described above.
  • in FIG. 8, the interior of a specified area is scanned by the EHF radar 11 in step S1.
  • FIG. 9 shows the result of scanning obtained from the EHF radar 11 .
  • the abscissa represents the horizontal position (angle) of the area scanned, and the ordinate the power strength in dB.
  • the signal power strength is high at the horizontal position corresponding to the preceding vehicle 6 .
  • in step S2, an object having a high signal power strength (not less than P dB) is recognized as a target, and the range (X0 to X1) where the target is present is held as a search area.
  • what is detected as a target is the preceding vehicle 6 alone. Power is not detected from a planar object like the lines 7 drawn on the road surface.
  • in step S3, the angular range (X0 to X1) obtained in step S2 is defined as a search area in the image acquired from the left and right cameras 12, 13.
  • in step S4, edges are extracted in the search area thus defined.
  • the processing for extracting vertical edges is well known by those skilled in the art and therefore will not be described herein.
  • in step S5, only edges high in signal power strength are extracted from among the extracted edges.
  • the present embodiment is such that the vertical edges are extracted only for the search area but not over the whole image. The result is as shown in FIG. 10, in which the edges are represented by vertical short lines.
  • in step S6, the peaks in the left and right images are matched, and in step S7, the parallax for the corresponding peaks is calculated thereby to obtain the distance to the preceding vehicle 6.
  • the process of steps S6, S7 is well known by those skilled in the art, and therefore will not be described.
  • the search area is defined for edge extraction by image processing and the processing time can be shortened. Also, objects such as lines or letters on the road surface are not reflected in the signal power of the EHF radar. Thus, only objects such as an obstacle or a preceding vehicle are detected.
  • the detection of the search area in step S2 of FIG. 8 can be variously modified.
  • FIG. 11 shows a different method of extracting the search area in step S2.
  • FIG. 12 shows the result of this edge extraction.
  • the range from the first level P0 to the second level P1 dB of the signal power strength obtained from the EHF radar 11 is defined as a predetermined level range of power strength, and the ranges X0 to X1 and X2 to X3 within the particular level range are extracted as search areas.
  • the portion of FIG. 11 where the power strength is high represents the detection of a target.
  • FIG. 13 shows another different method of extracting a search area in step S2 of FIG. 8.
  • the distribution of the signal power strength obtained from the EHF radar 11 may be divided into a plurality of peaks as shown in FIG. 13. This phenomenon often occurs when two vehicles 6, 9 are running ahead as shown in FIG. 14.
  • the horizontal positions X0 to X1 of the valley are extracted as a search area.
  • FIG. 15 is a flowchart showing the second process in the microcomputer 15.
  • in step S11, the interior of a specified area is scanned by the EHF radar 11.
  • FIG. 16 shows the result of scanning obtained from the EHF radar 11. Assume that the condition ahead of the vehicle is the same as that shown in FIG. 3.
  • in step S12, an object with a high signal power strength is recognized as a target, and the angle (X0) corresponding to the peak of the signal power strength in FIG. 16 is extracted and held.
  • in steps S13, S14, a search area is extracted based on the density change of the image.
  • FIG. 17 shows a density change of the image obtained from the cameras 12, 13. This density change represents the density of the image obtained from the cameras 12, 13, as expressed on a given horizontal line (X coordinate).
  • in step S13, an area in which the density change is laterally symmetric about the coordinate X0 corresponding to the peak is searched for, and the positions X1, X2 at which the symmetry ceases are held.
  • for a vehicle, the image density is laterally symmetric about the center while, outside of the vehicle, the image of the road surface, etc. is detected and therefore is not laterally symmetric. This indicates that a target area is probably located in the neighborhood of the positions X1, X2 where the lateral symmetry has disappeared. In view of the fact that perfect lateral symmetry cannot actually be obtained even for the same target, however, a certain degree of allowance is given for the symmetry decision.
  • in step S14, the areas (X3 to X4, X5 to X6) covering several neighboring coordinate points about the positions X1, X2 are held as search areas. In this way, the area in the vicinity of the positions X1, X2 is specified to show that the edges of a target are present in the particular search area.
  • the processing of steps S4 to S7 using this search area is similar to that in the flowchart of FIG. 8 described above. Also in this embodiment, the time required for image processing is shortened, and letters on the road surface are prevented from being detected erroneously as a target.
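A minimal sketch of the symmetry search described above, assuming the density profile is given as a one-dimensional array sampled along a horizontal line of the image; the function name, the allowance `tol` and the margin are illustrative, not from the patent:

```python
import numpy as np

def symmetry_break_areas(row: np.ndarray, x0: int, tol: float = 12.0,
                         margin: int = 4) -> list:
    # Walk outward from the radar peak position X0 and stop where the
    # density profile ceases to be laterally symmetric within tol.
    w = len(row)
    r = 1
    while x0 - r >= 0 and x0 + r < w:
        if abs(float(row[x0 - r]) - float(row[x0 + r])) > tol:
            break
        r += 1
    x1, x2 = x0 - r, x0 + r  # positions X1, X2 where symmetry ceased
    # Hold a few neighboring coordinates about each break position
    # (the areas X3 to X4 and X5 to X6 in the text) as search areas.
    return [(max(0, x1 - margin), min(w - 1, x1 + margin)),
            (max(0, x2 - margin), min(w - 1, x2 + margin))]
```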
  • FIG. 18 shows a method of extracting a search area using the density projection value of an image.
  • the density projection value is obtained by totaling the pixel densities in the vertical direction for the images obtained from the cameras 12, 13.
  • the search areas X3 to X4, X5 to X6 are obtained in a manner similar to the aforementioned embodiment 2-1, as sketched below.
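The projection itself is a one-liner; a sketch assuming a 2-D grayscale array:

```python
import numpy as np

def density_projection(image: np.ndarray) -> np.ndarray:
    # Total the pixel densities in the vertical direction: one value per
    # image column, which is then searched for symmetry breaks exactly
    # like the single-line profile above.
    return image.sum(axis=0, dtype=np.int64)
```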
  • FIG. 19 is a flowchart showing the third process in the microcomputer 15 of FIG. 7.
  • in step S21, an image is acquired from the cameras 12, 13.
  • in step S22, edges are extracted by image processing.
  • edges are extracted over the entire range of the image, and therefore the result shown in FIG. 4 and described above is obtained.
  • in step S24, the interior of the specified area is scanned by the EHF radar 11.
  • in step S25, the angular position Yn is extracted and held from the result of scanning. This angular position is similar to the one extracted as a search area in the embodiments described above, and any of the methods shown in FIGS. 9, 11 and 13, or a given combination thereof, can be used.
  • in step S26, the portion shared by the angular positions Xn of the extracted edges and the angular positions Yn is extracted.
  • in step S27, the parallax is determined for the target at each common angular position extracted and, by converting it into a distance value, a target is detected.
  • the time required for image processing is not shortened, but the measurement error due to letters or other obstacles on the road surface can be eliminated.
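The shared-portion extraction of step S26 amounts to intersecting the two sets of angular positions. A sketch, assuming both sets are expressed as scan indices and the tolerance is a free choice:

```python
def common_angular_positions(edge_positions, radar_positions, tol=1):
    # Keep only the image-edge angular positions Xn that the radar also
    # reports (Yn) within a small tolerance; only these are converted
    # into distance values in step S27.
    return [x for x in edge_positions
            if any(abs(x - y) <= tol for y in radar_positions)]
```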
  • FIG. 20 shows a target detection system for a vehicle according to a second embodiment of the invention.
  • This vehicle target detection system comprises an EHF radar 11 , a left camera 12 , a right camera 13 and an ECU 14 .
  • the ECU 14 includes an image recognition unit 25 for processing the images input from the two cameras 12, 13 and outputting edge data, and a processing unit 26 for detecting the presence of, and measuring the distance to, a target by fusion processing of the data input from the EHF radar 11 and the edge data input from the image recognition unit 25.
  • the configuration described above is similar to that of the conventional target detection system. Unlike the conventional system, in which the result is output unidirectionally from the image recognition unit 25 to the processing unit 26, however, the system shown in FIG. 20 sometimes establishes bidirectional communication between the processing unit 26 and the image recognition unit 25.
  • the EHF radar 11 radiates an extremely high-frequency wave forward of the vehicle, and detects the presence of, and the distance to, a target based on the radio wave reflected from the target.
  • the EHF radar 11, which has a low accuracy of distance measurement for a near target, outputs a near flag upon detection of a near target.
  • the near flag is output in a temporally stable state in the case where a target is located very near (not farther than 5 m, for example), and is output intermittently, in an unstable state, in the case where a near target is present (about 5 m to 10 m). In the case where a target is located far (not less than 10 m), on the other hand, no near flag is output.
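The near-flag semantics translate directly into distance bands, which the pre-processing described below then uses to keep or drop edge data. A sketch with the example thresholds quoted in the text (5 m and 10 m); the edge records are assumed to be dictionaries carrying a measured distance:

```python
FLAG_STABLE, FLAG_UNSTABLE, FLAG_ABSENT = "stable", "unstable", "absent"

def distance_band(flag):
    # Map the near-flag state to the distance band it implies.
    if flag == FLAG_STABLE:      # output continuously: very near target
        return (0.0, 5.0)
    if flag == FLAG_UNSTABLE:    # output intermittently: near target
        return (5.0, 10.0)
    return (10.0, float("inf"))  # no flag: far target

def select_edges(edges, flag):
    # Employ only the edge data whose measured distance falls inside
    # the band implied by the near flag (cf. steps S38 to S40).
    lo, hi = distance_band(flag)
    return [e for e in edges if lo <= e["distance"] < hi]
```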
  • the image recognition unit 25 extracts the edges of an input image (FIG. 21A) of the camera 12 . As a result, the edges shown in FIG. 21B are obtained. Then, from the result of this edge extraction, N (say, 16) edges are extracted in the descending order of strength (FIG. 21C).
  • a matching pattern 17 including M×M (say, 9×9) pixels is retrieved as shown in FIG. 21E, and the pattern matching is effected for the input image (FIG. 21D) from the other camera 13 thereby to detect corresponding edges. From the parallax between the two edges, the distance to each edge is calculated and the result is output to the processing unit 26 as edge data.
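A minimal sketch of such edge-based stereo matching: an M by M template around a strong edge in one image is slid along the same row of the other image. The patent does not name the matching criterion, so the sum of absolute differences (SAD) used here is an assumption, as is the disparity limit:

```python
import numpy as np

def match_edge(left: np.ndarray, right: np.ndarray, x: int, y: int,
               m: int = 9, max_disp: int = 80) -> int:
    # Slide the m-by-m pattern taken around the edge at (x, y) in the
    # left image along the same row of the right image and return the
    # disparity with the smallest SAD score. The caller must keep the
    # template fully inside both images.
    h = m // 2
    tpl = left[y - h:y + h + 1, x - h:x + h + 1].astype(np.int32)
    best_d, best_cost = 0, float("inf")
    for d in range(max_disp):
        if x - d - h < 0:
            break
        win = right[y - h:y + h + 1,
                    x - d - h:x - d + h + 1].astype(np.int32)
        cost = int(np.abs(tpl - win).sum())
        if cost < best_cost:
            best_cost, best_d = cost, d
    return best_d
```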
  • the image recognition unit 25 may erroneously output a distance by also extracting edges for the density differences of white lines and other objects that are not three-dimensional and are not the target. Also, the distance may be erroneously measured by a mis-operation in the case where the matching area happens to include a pattern similar to the M×M-pixel pattern 17 used for pattern matching, as shown in FIG. 21E.
  • the near flag output from the EHF radar 11 and the letter flag and the road surface flag output from the image recognition unit 25 are used so that the recognition error and the distance measurement error of the image recognition system for the fusion area 4 (FIG. 1) are prevented before the fusion processing in the processing unit 26 .
  • FIG. 22 shows a first specific configuration of a vehicle target detection system. The component parts that have already been explained with reference to FIG. 20 will not be explained again.
  • the pre-processing unit 28 of the processing unit 26 selects the edge data according to the near flag output from the EHF radar 11.
  • the edge data determined as effective are employed and output to the fusion processing unit 29 .
  • the image recognition unit 25 is supplied with images from the cameras 12, 13 (step S31) and extracts the edges from one of the images (step S32). From the edges thus extracted, a predetermined number of edges having a strong peak are extracted (step S33). The pattern matching against the other image is carried out for each edge (step S34) thereby to measure the distance (step S35).
  • the pre-processing unit 28 of the processing unit 26 determines whether the near flag is output from the EHF radar 11 (step S36) and, if it is, determines whether the near flag is output in a stable fashion (step S37).
  • in the case where it is determined that the near flag is output in a stable fashion (temporally continuously), it is determined that a target is present at a very near distance (say, 0 to 5 m), and the edge data having very-near-distance information (say, not more than 5 m) are employed (step S38). In the case where it is determined that the near flag is output in an unstable fashion (intermittently), on the other hand, it is determined that a target is located at a near distance (say, 5 to 10 m), and the edges having near-distance information (say, 5 to 10 m) are employed (step S39).
  • in the case where the near flag is not output, it is determined that a target is located far (say, not less than 10 m), so that the edges having far-distance information (say, not less than 10 m) in the fusion area 4 are employed (step S40).
  • the fusion processing is executed based on the edge data employed and the data output from the EHF radar 11 thereby to recognize the presence of a target and measure the distance to the target (step S41), followed by outputting the result (step S42).
  • the particular edge data is eliminated unless a target is detected by the EHF radar 11 in the area of erroneous distance measurement.
  • erroneous recognition or erroneous distance measurement for the target can be prevented.
  • invalid edge data is removed before the fusion processing and, therefore, the processing time can be shortened.
  • FIG. 24 shows a second specific circuit configuration of a vehicle target detection system according to a second embodiment. The component parts already explained will not be explained again.
  • the continuity determination unit 30 of the processing unit 26 determines the state of the near flag output from the EHF radar 11 , and the resulting data is sent to the invalid edge removing unit 31 of the image recognition unit 25 .
  • invalid edge data are removed in accordance with the state of the near flag, and the remaining edge data are output to the fusion processing unit 29.
  • in the image recognition unit 25, as in steps S31 to S33 of the embodiment 2-1 described above, the image is input (step S51), the edges are extracted (step S52) and the peak is extracted (step S53).
  • the image recognition unit 25 conducts pattern matching using the edge data not removed (step S54) and measures the distance (step S55).
  • the continuity determination unit 30 determines whether the near flag is output from the EHF radar 11 (step S56) and also whether the near flag is in a stable state (step S57), the result thereof being output to the invalid edge removing unit 31.
  • upon receipt of the data indicating that the near flag is stably output, the invalid edge removing unit 31 removes the edge data having other than very-near-distance information (step S58). Upon receipt of the data indicating that the near flag is output in an unstable fashion, on the other hand, the edges having other than near-distance information are removed (step S59). Further, in the case where no near flag is output, the edge data having other than far-distance information are removed (step S60).
  • the resulting edge data are output to the fusion processing unit 29.
  • the fusion processing is carried out (step S61) and the result is output (step S62).
  • This embodiment also produces the same effect as the embodiment 2-1 described above.
  • FIG. 26 shows a third specific circuit configuration of a vehicle target detection system according to a second embodiment. The component parts already explained will not be explained again.
  • the continuity determination unit 30 of the processing unit 26 determines the state of the near flag output from the EHF radar 11 , and sends the result data to the image recognition unit 25 .
  • an area priority setting unit 32 determines the order of priority of the pattern matching areas corresponding to the input result data, and performs the pattern matching for the selected area in priority.
  • FIG. 27 shows areas for which the pattern matching is conducted.
  • a matching pattern corresponding to the edge portion is taken out and, as shown in FIG. 21D, the pattern matching is carried out for the other image.
  • the order of priority of areas is determined according to the edge extraction position.
  • upon receipt of the data indicating that the near flag is stably output, the image recognition unit 25 performs the pattern matching for the area of the 26th to 80th pixels from the edge extraction position, as a very near area, in priority over the other areas. Upon receipt of the data indicating that the near flag is output in an unstable fashion, on the other hand, the image recognition unit 25 performs the pattern matching for the area of the 10th to 25th pixels, for example, in priority as a near area. Further, upon receipt of the data indicating that no near flag is output, the image recognition unit 25 performs the pattern matching for the area of the 0th to 9th pixels, for example, in priority as a far area.
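In code, the priority setting reduces to ordering the disparity bands before matching; the pixel ranges are the ones quoted above, while the order of the non-priority bands is an assumption:

```python
# Disparity bands, in pixels from the edge extraction position.
VERY_NEAR, NEAR, FAR = (26, 80), (10, 25), (0, 9)

def prioritized_bands(flag):
    # Search first the band implied by the near-flag state
    # ('stable', 'unstable' or None); the other bands follow.
    if flag == "stable":
        return [VERY_NEAR, NEAR, FAR]
    if flag == "unstable":
        return [NEAR, VERY_NEAR, FAR]
    return [FAR, NEAR, VERY_NEAR]
```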
  • an image is input (step S 71 ), edges are extracted (step S 72 ) and a peak is extracted (step S 73 ), and the continuity determination unit 30 determines whether the near flag is output or not (step S 74 ) and whether the near flag is stable or not (step S 75 ). The result is output to the edge priority setting unit 32 .
  • the edge priority setting unit 32 gives the priority to the very near distance for the pattern matching area (step S 76 ).
  • the near distance is given priority (step S 77 ).
  • the far distance is given priority (step S 78 ).
  • the image recognition unit 25 performs the pattern matching (step S 79 ) and measures the distance (step S 80 ) for the area given priority.
  • the resulting edge data is output to the fusion processing unit 29 .
  • step S 81 the fusion processing is carried out (step S 81 ) and the result is output (step S 82 ).
  • the pattern matching is started from the area most likely to match, and therefore the time until successful matching is shortened. Also, the possibility of encountering a similar matching pattern is reduced, thereby preventing erroneous distance measurement.
  • FIG. 29 shows a fourth specific circuit configuration of a vehicle target detection system according to the second embodiment. The component parts already explained will not be explained again.
  • a road surface/letter edge determination unit 33 of the image recognition unit 25 determines whether an extracted edge represents a line or a letter on the road surface or not, and outputs the result to the invalid edge removing unit 34 of the processing unit 26 .
  • the invalid edge removing unit 34 removes the invalid edges from the edge data input thereto from the image recognition unit 25 , and outputs the remaining edge data to the fusion processing unit 29 .
  • the edges are extracted according to the density difference on the image.
  • the edges of the letters and shadows on the road surface, though not a target, are extracted undesirably according to the density difference.
  • the road surface/letter edge determination unit 33 determines whether an extracted density difference belongs to the road surface or to a target, based on the distance information and height information of the density difference. In the case where it is determined that the density difference is that of the road surface, the edge data corresponding to the particular density difference is output to the invalid edge removing unit 34 with the road surface flag attached thereto.
  • the letters written on the road surface change from the road surface color to white or yellow or from white or yellow to the road surface color in the vicinity of the edge thereof.
  • the road surface/letter edge determination unit 33 determines whether road surface letters are detected, using the density information in the neighborhood of the extracted edge. Upon determining that road surface letters are involved, it outputs the edge data with a letter flag attached thereto to the invalid edge removing unit 34.
  • the invalid edge removing unit 34 determines whether a near flag is output from the EHF radar 11. Unless the near flag is output, the particular edge is determined to be a density difference or letters on the road surface and is removed, while the remaining edge data are output to the fusion processing unit 29.
  • an image is input (step S91), edges are extracted (step S92), a peak is extracted (step S93), the pattern matching is carried out (step S94), and the distance is measured (step S95).
  • in step S96, the road surface flag or the letter flag is attached to the corresponding edge data.
  • the invalid edge removing unit 34 determines whether the road surface flag or the letter flag exists (step S97), whether the edge distance information indicates a near distance (step S98), and whether the near flag is output from the EHF radar 11 (step S99). In the case where the road surface flag or the letter flag is attached, the edge distance information indicates a near distance and the near flag is not output, the edge data carrying the road surface flag or the letter flag, as the case may be, is removed (step S100), and the remaining edge data is delivered to the fusion processing unit 29.
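Steps S97 to S100 can be sketched as a single filter; the `Edge` record and the 10 m near-distance limit are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class Edge:
    distance: float  # distance measured by pattern matching, in meters
    flagged: bool    # road surface flag or letter flag attached

def remove_road_letter_edges(edges, radar_near_flag, near_limit=10.0):
    # Remove edge data that carries a road surface or letter flag, lies
    # at a near distance, and is not confirmed by a radar near flag;
    # everything else is delivered to the fusion processing.
    return [e for e in edges
            if not (e.flagged and e.distance < near_limit
                    and not radar_near_flag)]
```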
  • the fusion processing is carried out (step S101) and the result is output (step S102).
  • the road surface/letter edge determination unit 33 may output only the road surface flag from the distance and height of the density difference of the road surface or, conversely, may output only the letter flag from the change in the density difference of the road surface.
  • the process can be changed as shown in the flowchart of FIG. 31.
  • the invalid edge removing unit 34 determines whether the road surface flag or the letter flag is attached to the edge data, and also determines in step S981 whether the distance information indicates a far distance (say, not less than 10 m). In the case where the distance information indicates a far distance, it is determined in step S991 whether the distance data output from the EHF radar 11 is within the allowable error range of the distance information of the edge data. In the case where it is not within the allowable error range, the edge data to which the road surface flag or the letter flag is attached is removed in step S100.
  • the erroneous recognition and the erroneous distance measurement in the image recognition system can be prevented before the fusion processing by use of the letter flag and the road surface flag of the image recognition system.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Traffic Control Systems (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

A target detection system using an EHF radar and image processing is disclosed, in which the disadvantages of the EHF radar and of the image processing mutually complement each other, the processing time is shortened and the reliability is improved. The system comprises a radar, an image acquisition unit and an image processing ECU. The microcomputer of the ECU specifies an image recognition area based on the power output from the radar, and carries out the image processing, only within the specified recognition area, on the image obtained from the image acquisition unit. By performing the image processing only for the area where a target is detected by the radar, the time required for image processing is shortened on the one hand, and the erroneous detection of letters on the road surface or the like is eliminated on the other.

Description

    BACKGROUND OF THE INVENTION
  • 1. Field of the Invention [0001]
  • The present invention relates to a target detection system. The target detection system according to this invention is mounted on an automotive vehicle, for example, and is used to aid the driver in driving the vehicle by detecting a preceding vehicle, an obstacle lying ahead, or a similar target located ahead of the vehicle. [0002]
  • 2. Description of the Related Art [0003]
  • A conventional target detection system uses a fusion algorithm in which the reliability of data is evaluated by using both the result of detecting a target by an EHF radar and the result of detecting a target by image processing, thereby achieving the optimal result. Such a target detection system comprises an EHF radar, a left camera, a right camera and an image processing ECU (electronic control unit). The ECU includes an image processing microcomputer and a fusion processing microcomputer. [0004]
  • In the processing method using the EHF radar, a specified area ahead is scanned by the extremely high-frequency wave. The strength of the signal power output from the EHF radar and the range are so related to each other that the signal power strength is high for the portion where a target exists and low for the portion where a target does not exist. The EHF radar can measure a far distance with high accuracy but is low in accuracy for the measurement of a near target. Also, the EHF radar outputs a near flag upon detection of a near target. [0005]
  • The image processing microcomputer extracts the edges of each of the two images acquired by the two cameras. The edge positions in the two images are different due to the parallax between the left and right cameras, and this difference is used to calculate the distance to the target, as sketched below. Image processing can be used to measure the distance over a wide range but is low in accuracy for detection of a far target. [0006]
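The parallax-to-distance conversion behind this is the standard stereo range equation Z = f * B / d; the formula is implied rather than stated by the text, and the numbers in the comment are purely illustrative:

```python
def distance_from_parallax(disparity_px: float, focal_px: float,
                           baseline_m: float) -> float:
    # Standard stereo range: Z = f * B / d, with the focal length f in
    # pixels, the camera baseline B in meters and the disparity d in
    # pixels. Example: f = 700 px, B = 0.3 m, d = 10 px gives Z = 21 m.
    return focal_px * baseline_m / disparity_px
```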
  • The distance measurement by image processing further has the following problems. [0007]
  • 1. (Erroneous recognition) In view of the fact that the edge extraction processing simply extracts the edges from an image, the edges of letters written on the road surface, shadows or other objects that are not three-dimensional and differ from the target may be extracted erroneously due to their density difference. In such a case, edges are output in spite of the absence of a target. [0008]
  • 2. (Erroneous distance measurement) In the case where an edge is detected by the edge extraction processing, the distance is measured by pattern matching between the images acquired by the two cameras. In this processing, the result may become erroneous in the case where a similar pattern happens to exist. [0009]
  • FIG. 1 shows detection areas defined for the target detection system. [0010]
  • An area 2 in which a target can be detected by image processing has a large range, while an area 3 where a target can be detected by an EHF radar reaches a far distance. In an area 4 where a target can be detected by using both the image processing and the EHF radar, on the other hand, a target can be recognized very reliably by the fusion processing between the output data of the radar and the output data of the image processing. The area 4 is called the fusion area. The microcomputer for fusion processing determines the presence or absence of a target based on an overall decision on both the result of detection by the EHF radar and the result of detection by the image processing microcomputer, and thus recognizes the presence of a target such as a preceding vehicle and calculates the distance, etc. [0011]
  • In the conventional target detection system, the processing time for the fusion algorithm in the fusion processing microcomputer is required in addition to the processing time for the EHF radar and the processing time for the image processing microcomputer, and therefore the total processing time is long. Also, the conventional target detection system has yet to sufficiently overcome the disadvantages of the EHF radar and of image processing. [0012]
  • SUMMARY OF THE INVENTION
  • An object of the present invention is to provide a target detection system using the EHF radar and image processing, wherein the processing time for detection is shortened while the disadvantages of the EHF radar and of image processing are mutually compensated. [0013]
  • Another object of the invention is to provide a target detection system using both the EHF radar and the image processing, wherein the reliability is improved by preventing erroneous recognition and erroneous distance measurement in image recognition processing. [0014]
  • The present invention has been developed in order to achieve the objects described above. According to one aspect of the invention, there is provided a target detection system comprising a radar, an image acquisition unit and a processing unit. The processing unit specifies an area for image recognition based on the data output from the radar, and processes the image data output from the image acquisition unit only within the specified area thereby to detect a target. According to this invention, objects other than three-dimensional ones, including lines or letters on the road surface, are not detected by the radar as a target, and therefore lines, letters and other such planar objects are not detected as a target by image processing either. Also, the image data are processed only for an area where a target such as an obstacle or a vehicle is detected by the radar, and therefore the time required for processing the image data is shortened, thereby shortening the processing time for target detection as a whole. [0015]
  • In the target detection system according to this invention, the image recognition area can be specified based on the power of the signal output from the radar. Upon detection of a target such as an obstacle or a vehicle, the radar outputs a signal of predetermined power. A target is extracted only from an area having such a target by extracting the edges of the image data only for the particular area in which the signal power is detected. As a result, the time required for image data processing can be shortened. Alternatively, all the edges may be extracted from the image data and only the edges existing in the image recognition area may be processed as effective edges for target detection. In such a case, the time required for image processing is not shortened but the time required for fusion processing can be shortened. [0016]
  • In the target detection system according to this invention, the image recognition area can be determined based on the state of the near flag output from the radar. Upon detection of a near target, the radar outputs a near flag, the state of which changes with the distance to the target. In the processing for target detection, the edge data acquired in the image processing is selected in accordance with the presence or absence and the state of the near flag, and therefore the recognition error and the distance measurement error of the target can be prevented before the fusion processing. [0017]
  • Further, in the target detection system according to this invention, a road surface flag and a letter flag can be attached to the edge data extracted by image processing in the case where a density difference on the image due to lines or letters on the road surface is detected. For the edge data with the road surface flag or the letter flag, it is determined whether the edge data actually represents lines or letters written on the road surface. In the case where the edge data are found to represent lines or letters, the data in the particular area are invalidated. As a result, the recognition error in which lines or letters on the road surface are recognized as a target, and the associated measurement error, can be prevented before the fusion processing. [0018]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The above object and features of the present invention will be more apparent from the following description of the preferred embodiment with reference to the accompanying drawings, wherein: [0019]
  • FIG. 1 shows different detection areas defined for a target detection system mounted on an automotive vehicle; [0020]
  • FIG. 2 shows a basic configuration of a target detection system; [0021]
  • FIG. 3 shows an image picked up as the condition ahead of the target detection system; [0022]
  • FIG. 4 is a diagram showing the edges calculated by the image processing in the system of FIG. 2; [0023]
  • FIG. 5 is a diagram showing typical edges extracted by processing the edges of FIG. 4; [0024]
  • FIG. 6 is a diagram showing the data output from the EHF radar of FIG. 2; [0025]
  • FIG. 7 shows a vehicle target detection system according to a first embodiment of the invention; [0026]
  • FIG. 8 is a flowchart showing the processing steps of the microcomputer of FIG. 7; [0027]
  • FIG. 9 shows a first method of determining a search area from the signal power strength obtained by the EHF radar of FIG. 7; [0028]
  • FIG. 10 shows edges extracted by the processing shown in FIG. 8; [0029]
  • FIG. 11 shows a second method for determining a search area from the strength of the signal power obtained by the EHF radar shown in FIG. 7; [0030]
  • FIG. 12 shows edges extracted by the method shown in FIG. 11; [0031]
  • FIG. 13 shows a third method for determining a search area from the strength of the signal power obtained by the EHF radar shown in FIG. 7; [0032]
  • FIG. 14 shows an image of a plurality of preceding vehicles; [0033]
  • FIG. 15 is a flowchart showing the second processing of the microcomputer of FIG. 7; [0034]
  • FIG. 16 shows the strength of the power obtained by the EHF radar in the processing shown in FIG. 15; [0035]
  • FIG. 17 shows a first method for determining a search area in the process of FIG. 15; [0036]
  • FIG. 18 shows a second method for determining a search area in the process of FIG. 15; [0037]
  • FIG. 19 is a flowchart showing the third processing of the microcomputer of FIG. 7; [0038]
  • FIG. 20 shows a vehicle target detection apparatus according to a second embodiment of the invention; [0039]
  • FIGS. 21A to 21E are diagrams for explaining the processing in the image recognition unit of FIG. 20; [0040]
  • FIG. 22 shows a first specific circuit configuration of a target detection system according to a second embodiment; [0041]
  • FIG. 23 is a flowchart showing the operation of the system of FIG. 22; [0042]
  • FIG. 24 shows a second specific circuit configuration of a target detection system according to the second embodiment; [0043]
  • FIG. 25 is a flowchart showing the operation of the system of FIG. 24; [0044]
  • FIG. 26 shows a third specific circuit configuration of a target detection system according to the second embodiment; [0045]
  • FIG. 27 shows a pattern matching area in FIG. 26; [0046]
  • FIG. 28 is a flowchart showing the operation of the system of FIG. 26; [0047]
  • FIG. 29 shows a fourth specific circuit configuration of a vehicle target detection system according to the second embodiment; [0048]
  • FIG. 30 is a flowchart showing the first operation of the system of FIG. 29; and [0049]
  • FIG. 31 is a flowchart showing the second operation of the system of FIG. 29.[0050]
  • DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • First, an explanation will be given of the principle of fusion processing for target detection which is used in a target detection system according to this invention. [0051]
  • As shown in FIG. 2, the target detection system comprises an EHF radar 11, a left camera 12, a right camera 13 and an image processing ECU 14. The ECU 14 is configured with an image processing microcomputer 21 and a fusion processing microcomputer 22. The image processing microcomputer 21 detects a target by processing the image data obtained from the cameras 12, 13. Specifically, edges are extracted from the images obtained from the left and right cameras 12, 13, and the parallax is calculated from the left and right positions of edge extraction thereby to calculate the distance value. The processing operation of the image processing microcomputer 21 will be explained with reference to FIG. 3. [0052]
  • FIG. 3 shows a picked-up image of the condition ahead of the target detection system. Ahead of the vehicle, there is a preceding vehicle 6, lines 7 are drawn on the road surface, and a guard rail 8 is present at the shoulder. FIG. 4 shows the result of calculating the edges by processing the image of FIG. 3, and FIG. 5 the result of extracting the edges in the descending order of peak strength by processing the result of FIG. 4. In FIG. 5, the vertical short lines represent the edges extracted. The image processing microcomputer 21 extracts the edges of FIG. 5 from the images of the left and right cameras 12, 13 and calculates the distance to the target using the parallax. [0053]
  • In the processing method of the EHF radar 11, the interior of a specified area is scanned by the EHF radar and the portion of the output data having strong power is recognized as a target. FIG. 6 shows the relation between the horizontal position (angle or range) of the scanned area and the power strength of the data output from the EHF radar 11. It is seen that the power strength of the portion where the target is present is high, and vice versa. [0054]
  • The fusion processing microcomputer 22 determines whether a target is present or not by overall analysis of the detection result of the EHF radar 11 and the detection result of the image processing microcomputer 21, and thereby checks the presence of a target such as a preceding vehicle and calculates the distance to the preceding vehicle. [0055]
  • Embodiment 1
  • FIG. 7 shows a vehicle target detection system according to a first embodiment of the invention. [0056]
  • A vehicle target detection system comprises an EHF radar 11, a left camera 12, a right camera 13 and an image processing ECU 14. The ECU 14 is configured with a microcomputer 15 having the dual functions of image processing and fusion processing. Although the two cameras 12, 13, left and right, are used for measuring the distance by parallax in image processing, only one camera will do in the case where the distance is not measured by parallax. [0057]
  • Now, the processing in the [0058] microcomputer 15 will be explained.
  • Embodiment 1-1
  • FIG. 8 is a flowchart showing the processing in the microcomputer 15. The condition ahead of the vehicle is assumed to be the same as shown in FIG. 3 and described above. [0059]
  • In FIG. 8, the interior of a specified area is scanned by the EHF radar 11 in step S1. FIG. 9 shows the result of scanning obtained from the EHF radar 11. In FIG. 9, the abscissa represents the horizontal position (angle) of the area scanned, and the ordinate the power strength in dB. In the case where a preceding vehicle 6 is present, as shown, the signal power strength is high at the horizontal position corresponding to the preceding vehicle 6. [0060]
• In step S2, an object having a high signal power strength (not less than P dB) is recognized as a target, and the range (X0 to X1) where the target is present is held as a search area. In the case shown in FIG. 3, only the preceding vehicle 6 is detected as a target. No significant power is returned from a planar object such as the lines 7 drawn on the road surface. [0061]
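A minimal sketch of the step S2 thresholding, assuming the radar output is available as a list of (angle, power) samples; the helper simply groups consecutive samples whose power stays at or above the threshold P, which the patent leaves unspecified.

```python
def contiguous_ranges(scan, keep):
    """Group a radar scan into contiguous angular ranges where keep(power)
    holds. scan is a list of (angle, power_db) samples in scan order."""
    ranges, start, prev_angle = [], None, None
    for angle, power in scan:
        if keep(power) and start is None:
            start = angle                       # entering a region
        elif not keep(power) and start is not None:
            ranges.append((start, prev_angle))  # region just ended
            start = None
        prev_angle = angle
    if start is not None:
        ranges.append((start, prev_angle))      # region runs to the scan edge
    return ranges

P_DB = 10.0  # placeholder for the unspecified threshold P
scan = [(-8, 2), (-6, 3), (-4, 12), (-2, 15), (0, 14), (2, 4), (4, 2)]
print(contiguous_ranges(scan, lambda p: p >= P_DB))  # [(-4, 0)]
```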
• In step S3, the angular range (X0 to X1) obtained in step S2 is defined as a search area in the images acquired from the left and right cameras 12, 13. [0062]
• In step S4, edges are extracted in the search area thus defined. The processing for extracting vertical edges is well known to those skilled in the art and therefore will not be described herein. [0063]
• In step S5, only the edges having a high strength are extracted from among the extracted edges. Unlike FIG. 4, which shows the result of edge extraction over the whole of the images obtained from the cameras 12, 13, in the present embodiment the vertical edges are extracted only within the search area, not over the whole image. The result is as shown in FIG. 10, in which the edges are represented by vertical short lines. [0064]
• By extracting the vertical edges only within the specified search area in this way, the processing time can be shortened as compared with the case where a target is detected based on the edges of the whole image. Also, no edges are extracted for the lines 7, etc. (FIG. 3), which are not included in the search area of FIG. 9, and therefore lines or letters written on the road surface are not erroneously detected as a target. [0065]
• In step S6, the peaks in the left and right images are matched, and in step S7, the parallax for the corresponding peaks is calculated to obtain the distance to the preceding vehicle 6. The process of steps S6, S7 is well known to those skilled in the art, and therefore will not be described. [0066]
• In the example described above, the search area is defined for edge extraction by image processing, so that the processing time can be shortened. Also, objects such as lines or letters on the road surface are not reflected in the signal power of the EHF radar, so that only solid objects such as an obstacle or a preceding vehicle are detected. [0067]
  • Embodiment 1-2
• The detection of the search area in step S2 of FIG. 8 can be variously modified. [0068]
• FIG. 11 shows a different method of extracting the search area in step S2. FIG. 12 shows the result of this edge extraction. In FIG. 11, the range from a first level P0 to a second level P1 dB of the signal power strength obtained from the EHF radar 11 is defined as a predetermined level range of power strength, and the ranges X0 to X1 and X2 to X3 falling in this level range are extracted as search areas. The portion of FIG. 11 where the power strength is high represents the detection of a target, and the range of P0 to P1 dB, where the signal power strength changes sharply, represents the edge positions of the target. According to this embodiment, therefore, the positions where edges are likely to be extracted are narrowed further, so that, as shown in FIG. 12, only the edges of the target are extracted, further shortening the processing time. [0069]
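The band-based variant can be sketched in the same style; only the predicate changes. The P0 and P1 levels in the example call are placeholders.

```python
def edge_band_areas(scan, p0_db, p1_db):
    """Angular ranges where the power lies between p0_db and p1_db --
    the rising/falling flanks of a target peak, where the target's
    edges are most likely to be found (FIG. 11)."""
    areas, start, prev = [], None, None
    for angle, power in scan:
        inside = p0_db <= power <= p1_db
        if inside and start is None:
            start = angle
        elif not inside and start is not None:
            areas.append((start, prev))
            start = None
        prev = angle
    if start is not None:
        areas.append((start, prev))
    return areas

scan = [(-6, 1), (-4, 6), (-2, 14), (0, 15), (2, 7), (4, 1)]
print(edge_band_areas(scan, 5.0, 10.0))  # [(-4, -4), (2, 2)] -- the two flanks
```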
• Embodiment 1-3
• FIG. 13 shows another method of extracting a search area in step S2 of FIG. 8. The distribution of the signal power strength obtained from the EHF radar 11 may be divided into a plurality of peaks as shown in FIG. 13. This phenomenon often occurs when two vehicles 6, 9 are running ahead as shown in FIG. 14. In the case where the power distribution is divided into two peaks in this way, the horizontal positions X0 to X1 of the valley (the portion where the signal power strength is not more than P1 dB) are extracted as a search area. [0070]
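A sketch of the valley extraction, under the same assumed (angle, power) sample format; the peak threshold used to recognize the two lobes is an assumption, since the text only fixes the valley level P1.

```python
def valley_between_peaks(scan, p_peak_db, p1_db):
    """Angular range (X0, X1) of the valley between two power peaks:
    samples at or below p1_db flanked on both sides by samples whose
    power exceeds p_peak_db (FIG. 13). Returns None if no such valley."""
    prev_angle, x0, seen_peak = None, None, False
    for angle, power in scan:
        if power > p_peak_db:
            if x0 is not None:
                return (x0, prev_angle)   # valley closed by the second peak
            seen_peak = True
        elif seen_peak and power <= p1_db and x0 is None:
            x0 = angle                    # valley starts after the first peak
        prev_angle = angle
    return None

# Two vehicles ahead produce two lobes with a dip between them:
scan = [(-4, 12), (-2, 13), (0, 3), (2, 4), (4, 12), (6, 11)]
print(valley_between_peaks(scan, 10.0, 5.0))  # (0, 2)
```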
  • Embodiment 1-4
• When a vehicle is actually running on a toll road or a freeway, a single preceding vehicle is rare, and in almost all cases a plurality of vehicles are running ahead. Various patterns are therefore obtained in the output of the EHF radar, and no single pattern can be assumed. In view of this, the actual driving requirement is met by taking the union of all the search areas described in embodiments 1-1 to 1-3 as the search area. [0071]
  • Embodiment 1-5
• FIG. 15 is a flowchart showing the second process in the microcomputer 15. [0072]
• In step S11, the interior of a specified area is scanned by the EHF radar 11. FIG. 16 shows the result of scanning obtained from the EHF radar 11. Assume that the condition ahead of the vehicle is the same as that shown in FIG. 3. [0073]
• In step S12, an object with a high signal power strength is recognized as a target, and the angle (X0) corresponding to the peak of the signal power strength in FIG. 16 is extracted and held. [0074]
• In steps S13, S14, a search area is extracted based on the density change of the image. FIG. 17 shows a density change of the image obtained from the cameras 12, 13, i.e. the image density expressed along a given horizontal line (the X coordinate). [0075]
• In step S13, the density change is searched for an area laterally symmetric about the coordinate X0 corresponding to the peak, and the positions X1, X2 where the symmetry ceases are held. In the case where the target is a vehicle, its image density is laterally symmetric about the center, while outside the vehicle the image of the road surface, etc. is detected and is therefore not laterally symmetric. This indicates that the boundary of a target area is probably located in the neighborhood of the positions X1, X2 where the lateral symmetry has disappeared. Since perfect lateral symmetry cannot actually be obtained even for a single target, however, a certain allowance is given in the symmetry decision. [0076]
• In step S14, the areas (X3 to X4, X5 to X6) covering several neighboring coordinate points about the positions X1, X2 are held as a search area. In this way, the areas in the vicinity of the positions X1, X2 are specified, indicating that the edges of a target are present in those search areas. [0077]
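A sketch of the symmetry search of steps S13 and S14, assuming a one-dimensional density profile is available; the symmetry allowance, the run length that ends the search, and the area margin are all assumed values, not figures from the patent.

```python
def symmetry_break_positions(density, x0, tol=12, run=3):
    """Scan outward from column x0 of a 1-D density profile and return
    the positions (X1, X2) where lateral symmetry about x0 first breaks
    down (step S13). tol is the per-pair symmetry allowance; run is the
    number of consecutive asymmetric pairs that ends the search."""
    bad = 0
    for d in range(1, min(x0, len(density) - 1 - x0) + 1):
        if abs(density[x0 - d] - density[x0 + d]) > tol:
            bad += 1
            if bad >= run:
                return x0 - d, x0 + d   # symmetry has ceased here
        else:
            bad = 0
    return None                          # symmetric out to the image border

def areas_around(x1, x2, width=4):
    """Step S14: hold small areas about the break positions; the margin
    width is likewise an assumed value."""
    return (x1 - width, x1 + width), (x2 - width, x2 + width)
```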
• The processes in the subsequent steps S4 to S7 using this search area are similar to those in the flowchart of FIG. 8 described above. Also in this embodiment, the time required for image processing is shortened, and letters on the road surface are prevented from being erroneously detected as a target. [0078]
  • Embodiment 1-6
• The extraction of a search area by image processing in steps S13, S14 described above can use a density projection value instead of a density change of the image. [0079]
• FIG. 18 shows a method of extracting a search area using the density projection value of an image. The density projection value is obtained by totaling the pixel densities in the vertical direction for the images obtained from the cameras 12, 13. In this embodiment, too, the search areas X3 to X4, X5 to X6 are obtained in a manner similar to embodiment 1-5 described above. [0080]
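A one-line sketch of the projection itself, assuming the image is available as a 2-D array of grey levels; the result can be fed to the same symmetry search as the density-change variant.

```python
import numpy as np

def vertical_density_projection(image):
    """Column-wise total of pixel densities: the density projection value
    of FIG. 18. image is a 2-D array of grey levels (rows x columns)."""
    return image.sum(axis=0)

# The projection can then be searched for lateral symmetry, e.g. with
# symmetry_break_positions(projection, x0) from the sketch above.
img = np.array([[10, 10, 200, 10], [10, 12, 198, 11]])
print(vertical_density_projection(img))  # [ 20  22 398  21]
```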
  • Embodiment 1-7
• FIG. 19 is a flowchart showing the third process in the microcomputer 15 of FIG. 7. In step S21, an image is acquired from the cameras 12, 13. [0081]
• In step S22, edges are extracted by image processing. In this image processing, edges are extracted over the entire range of the image, and therefore the result as shown in FIG. 4 described above is obtained. [0082]
• From the edges obtained, peaks are extracted in descending order of peak strength in step S23. The result is as shown in FIG. 5. The extraction position of each edge is held as an angular position Xn. [0083]
• In step S24, the interior of the specified area is scanned by the EHF radar 11. In step S25, the angular position Yn is extracted and held from the result of scanning. This angular position Yn is similar to the one extracted as a search area in the embodiments described above, and any of the methods shown in FIGS. 9, 11 and 13, or a given combination thereof, can be used. [0084]
• In step S26, the portion shared by the angular positions Xn and Yn is extracted. In step S27, the parallax is determined for the target at each common angular position extracted and converted into a distance value, whereby a target is detected. [0085]
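The angular intersection of step S26 can be sketched as a simple filter, assuming the radar areas Yn are held as (start, end) angle pairs.

```python
def edges_in_radar_areas(edge_angles, radar_areas):
    """Step S26: keep only the image edges whose angular position Xn
    falls inside one of the angular areas Yn taken from the radar scan."""
    return [x for x in edge_angles
            if any(y0 <= x <= y1 for y0, y1 in radar_areas)]

# Edges from letters on the road fall outside every radar area and are
# discarded before the parallax computation:
print(edges_in_radar_areas([-5.0, -1.2, 0.4, 3.8], [(-2.0, 1.0)]))  # [-1.2, 0.4]
```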
• In embodiment 1-7, the time required for image processing is not shortened, but measurement errors due to letters and other objects on the road surface can be eliminated. [0086]
  • Embodiment 2
  • FIG. 20 shows a target detection system for a vehicle according to a second embodiment of the invention. [0087]
• This vehicle target detection system comprises an EHF radar 11, a left camera 12, a right camera 13 and an ECU 14. The ECU 14 includes an image recognition unit 25 for processing the images input from the two cameras 12, 13 and outputting edge data, and a processing unit 26 for detecting the presence of, and measuring the distance to, a target by fusion processing of the data input from the EHF radar 11 and the edge data input from the image recognition unit 25. [0088]
• The configuration described above is similar to that of the conventional target detection system. Unlike the conventional system, however, in which the result is output unidirectionally from the image recognition unit 25 to the processing unit 26, the target detection system shown in FIG. 20 sometimes establishes bidirectional communication between the processing unit 26 and the image recognition unit 25. [0089]
• The EHF radar 11 radiates an extremely high frequency wave forward of the vehicle, and detects the presence of, and the distance to, a target based on the radio wave reflected from the target. The EHF radar 11, which has a low accuracy of distance measurement for a near target, outputs a near flag upon detection of a near target. The near flag is output in a temporally stable state in the case where a target is located very near (not farther than 5 m, for example), and is output intermittently, in an unstable state, in the case where a near target is present (at about 5 m to 10 m). In the case where a target is located far (not less than 10 m), on the other hand, no near flag is output. [0090]
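One way to picture the three flag states is to classify the flag's duty cycle over the last few radar cycles. This is only an illustrative reading: the patent does not define numeric thresholds for "stable" versus "intermittent", so the figures below are assumptions.

```python
def near_flag_state(flag_history):
    """Classify the near flag over the last few radar cycles.

    flag_history: booleans, one per cycle (True = near flag output).
    The duty-cycle thresholds are assumptions for illustration.
    """
    duty = sum(flag_history) / len(flag_history)
    if duty >= 0.8:
        return "stable"     # target very near (roughly 0-5 m)
    if duty > 0.0:
        return "unstable"   # target near (roughly 5-10 m)
    return "none"           # target far (10 m or more)

print(near_flag_state([True] * 10))                 # stable
print(near_flag_state([True, False, True, False]))  # unstable
print(near_flag_state([False] * 10))                # none
```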
• The processing in the image recognition unit 25 will be explained with reference to FIGS. 21A to 21E. First, the image recognition unit 25 extracts the edges of an input image (FIG. 21A) of the camera 12. As a result, the edges shown in FIG. 21B are obtained. Then, from the result of this edge extraction, N (say, 16) edges are extracted in descending order of strength (FIG. 21C). [0091]
• From each of the N edges, a matching pattern 17 including M×M (say, 9×9) pixels is retrieved as shown in FIG. 21E, and pattern matching is effected on the input image (FIG. 21D) from the other camera 13 to detect the corresponding edges. From the parallax between the two edges, the distance to each edge is calculated and the result is output to the processing unit 26 as edge data. [0092]
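A minimal sketch of the M×M pattern matching, using the sum of absolute differences as the similarity measure; the patent does not name a particular measure, so SAD here is an assumption, as is the restriction to rectified images (the corresponding edge assumed to lie on the same row in both views).

```python
import numpy as np

def match_edge(left, right, x, y, m=9):
    """Find the column in the right image whose MxM neighbourhood best
    matches the MxM pattern 17 around edge (x, y) in the left image,
    by minimising the sum of absolute differences (SAD)."""
    h = m // 2
    pattern = left[y - h:y + h + 1, x - h:x + h + 1].astype(np.int32)
    best_x, best_sad = None, None
    for xr in range(h, right.shape[1] - h):
        cand = right[y - h:y + h + 1, xr - h:xr + h + 1].astype(np.int32)
        sad = int(np.abs(pattern - cand).sum())
        if best_sad is None or sad < best_sad:
            best_x, best_sad = xr, sad
    return best_x   # the parallax is then x - best_x
```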
• As shown in FIG. 21B, the image recognition unit 25 also extracts edges from the density differences of white lines and other objects that are not three-dimensional targets, and may therefore output an erroneous distance. The distance may also be measured erroneously by a mismatch in the case where the matching area happens to include a pattern similar to the M×M-pixel pattern 17 used for pattern matching, as shown in FIG. 21E. [0093]
• In view of this, according to this invention, the near flag output from the EHF radar 11 and the letter flag and the road surface flag output from the image recognition unit 25 are used so that the recognition errors and distance measurement errors of the image recognition system for the fusion area 4 (FIG. 1) are prevented before the fusion processing in the processing unit 26. [0094]
  • Embodiment 2-1
  • FIG. 22 shows a first specific configuration of a vehicle target detection system. The component parts that have already been explained with reference to FIG. 20 will not be explained again. [0095]
• When edge data is output from the image recognition unit 25, the pre-processing unit 28 of the processing unit 26 screens the edge data according to the near flag output from the EHF radar 11. The edge data determined to be effective are employed and output to the fusion processing unit 29. [0096]
  • This processing will be explained in detail with reference to the flowchart of FIG. 23. [0097]
• The image recognition unit 25 is supplied with images from the cameras 12, 13 (step S31) and extracts the edges from one of the images (step S32). From the edges thus extracted, a predetermined number of edges having a strong peak are extracted (step S33). Pattern matching against the other image is carried out for each edge (step S34) to measure the distance (step S35). [0098]
• The pre-processing unit 28 of the processing unit 26 determines whether the near flag is output from the EHF radar 11 (step S36) and, if it is, determines whether the near flag is output in a stable fashion (step S37). [0099]
• In the case where it is determined that the near flag is output in a stable fashion (temporally continuously), it is determined that a target is present at a very near distance (say, 0 to 5 m), and the edge data having very-near-distance information (say, not more than 5 m) is employed (step S38). In the case where it is determined that the near flag is output in an unstable fashion (intermittently), on the other hand, it is determined that a target is located at a near distance (say, 5 to 10 m), and the edge data having near-distance information (say, 5 to 10 m) is employed (step S39). Further, in the case where the near flag is not output, it is determined that a target is located far (say, not less than 10 m), so that the edge data having far-distance information (say, not less than 10 m) in the fusion area 4 is employed (step S40). [0100]
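The selection of steps S38 to S40 reduces to a band filter over the edge distances. The band limits below are the "say" figures from the text; the tuple record layout for edge data is hypothetical.

```python
# Distance bands from the text's "say" figures (meters):
BANDS = {
    "stable":   (0.0, 5.0),            # near flag stable   -> very near
    "unstable": (5.0, 10.0),           # near flag unstable -> near
    "none":     (10.0, float("inf")),  # no near flag       -> far
}

def select_edges(edges, flag_state):
    """Steps S38-S40: employ only the edge data whose measured distance
    falls in the band matching the near-flag state.

    edges: list of (x_position, distance_m) tuples -- a hypothetical
    record layout for illustration.
    """
    lo, hi = BANDS[flag_state]
    return [e for e in edges if lo <= e[1] < hi]

edges = [(120, 3.2), (250, 7.5), (410, 24.0)]
print(select_edges(edges, "none"))  # [(410, 24.0)]
```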
• In the fusion processing unit 29, the fusion processing is executed based on the edge data employed and the data output from the EHF radar 11 to recognize the presence of a target and measure the distance to the target (step S41), and the result is output (step S42). [0101]
• According to this embodiment, even in the case where the edge data is recognized erroneously or the distance is measured erroneously by the image recognition unit 25, the particular edge data is eliminated unless a target is detected by the EHF radar 11 in the area of erroneous distance measurement. Thus, erroneous recognition of and erroneous distance measurement to the target can be prevented. Also, invalid edge data is removed before the fusion processing, so that the processing time can be shortened. [0102]
  • Embodiment 2-2
• FIG. 24 shows a second specific circuit configuration of a vehicle target detection system according to the second embodiment. The component parts already explained will not be explained again. [0103]
• The continuity determination unit 30 of the processing unit 26 determines the state of the near flag output from the EHF radar 11, and the resulting data is sent to the invalid edge removing unit 31 of the image recognition unit 25. In the invalid edge removing unit 31, invalid edge data are removed in accordance with the state of the near flag, and the remaining edge data is output to the fusion processing unit 29. [0104]
  • The aforementioned process will be explained in detail with reference to the flowchart of FIG. 25. [0105]
• In the image recognition unit 25, as in steps S31 to S33 of embodiment 2-1 described above, the image is input (step S51), the edges are extracted (step S52) and the peak is extracted (step S53). [0106]
• The image recognition unit 25, as in steps S34, S35 of the aforementioned embodiment, conducts pattern matching using the edge data not removed (step S54) and measures the distance (step S55). [0107]
• Then, as in steps S36, S37 of embodiment 2-1 described above, the continuity determination unit 30 determines whether the near flag is output from the EHF radar 11 (step S56) and whether the near flag is in a stable state (step S57), the results being output to the invalid edge removing unit 31. [0108]
• In the case where the near flag is output in a stable fashion, the invalid edge removing unit 31 removes the edge data having other than very-near-distance information (step S58). Upon receipt of data indicating that the near flag is output in an unstable fashion, on the other hand, the edges having other than near-distance information are removed (step S59). Further, in the case where no near flag is output, the edge data having other than far-distance information are removed (step S60). [0109]
• The resulting edge data is output to the fusion processing unit 29. In the fusion processing unit 29, as in steps S41, S42 of embodiment 2-1 described above, the fusion processing is carried out (step S61) and the result is output (step S62). [0110]
• This embodiment also produces the same effect as embodiment 2-1 described above. [0111]
  • Embodiment 2-3
• FIG. 26 shows a third specific circuit configuration of a vehicle target detection system according to the second embodiment. The component parts already explained will not be explained again. [0112]
• The continuity determination unit 30 of the processing unit 26 determines the state of the near flag output from the EHF radar 11, and sends the resulting data to the image recognition unit 25. In the image recognition unit 25, an area priority setting unit 32 determines the order of priority of the pattern matching areas according to the input data, and performs the pattern matching preferentially for the selected area. [0113]
  • FIG. 27 shows areas for which the pattern matching is conducted. [0114]
• Upon extraction of the edges from one of the images, as shown in FIG. 21A, a matching pattern corresponding to each edge portion is taken out and, as shown in FIG. 21D, the pattern matching is carried out on the other image. In the process, based on the data input from the continuity determination unit 30, the order of priority of the areas is determined according to the edge extraction position. [0115]
• Upon receipt of data indicating that the near flag is stably output, the image recognition unit 25 performs the pattern matching preferentially for the area of the 26th to 80th pixels from the edge extraction position, treating it as a very near area. Upon receipt of data indicating that the near flag is output in an unstable fashion, on the other hand, the image recognition unit 25 preferentially performs the pattern matching for the area of, for example, the 10th to 25th pixels as a near area. Further, upon receipt of data indicating that no near flag is output, the image recognition unit 25 preferentially performs the pattern matching for the area of, for example, the 0th to 9th pixels as a far area. [0116]
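A sketch of the priority ordering of embodiment 2-3; the three pixel ranges are those quoted above, while the fallback order of the non-priority areas is an assumption, since the text only states which area is searched first.

```python
# Disparity areas quoted in the text (pixels from the edge extraction
# position); a larger disparity means a nearer target:
AREAS = {"very_near": (26, 80), "near": (10, 25), "far": (0, 9)}

def matching_order(flag_state):
    """Order the pattern-matching areas so that the area most consistent
    with the near-flag state is searched first."""
    order = {
        "stable":   ["very_near", "near", "far"],
        "unstable": ["near", "very_near", "far"],
        "none":     ["far", "near", "very_near"],
    }
    return [AREAS[name] for name in order[flag_state]]

print(matching_order("stable"))  # [(26, 80), (10, 25), (0, 9)]
```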
  • The aforementioned processing will be explained in detail with reference to the flowchart of FIG. 28. [0117]
• In the image recognition unit 25, an image is input (step S71), edges are extracted (step S72) and a peak is extracted (step S73), and the continuity determination unit 30 determines whether the near flag is output (step S74) and whether the near flag is stable (step S75). The result is output to the area priority setting unit 32. [0118]
• In the case where the near flag is output in a stable fashion, the area priority setting unit 32 gives priority to the very near distance for the pattern matching area (step S76). Upon receipt of data indicating that the near flag is output in an unstable fashion, on the other hand, the near distance is given priority (step S77). Further, in the case where no near flag is output, the far distance is given priority (step S78). [0119]
• The image recognition unit 25 performs the pattern matching (step S79) and measures the distance (step S80) for the area given priority. The resulting edge data is output to the fusion processing unit 29. [0120]
• In the fusion processing unit 29, as in steps S41 and S42 of embodiment 2-1 described above, the fusion processing is carried out (step S81) and the result is output (step S82). [0121]
• According to this embodiment, the pattern matching is started from the area most likely to match, and therefore the time until a successful match is shortened. Also, the possibility of encountering a similar matching pattern is reduced, thereby preventing erroneous distance measurement. [0122]
  • Embodiment 2-4
  • FIG. 29 shows a fourth specific circuit configuration of a vehicle target detection system according to the second embodiment. The component parts already explained will not be explained again. [0123]
• A road surface/letter edge determination unit 33 of the image recognition unit 25 determines whether an extracted edge represents a line or a letter on the road surface, and outputs the result to the invalid edge removing unit 34 of the processing unit 26. The invalid edge removing unit 34 removes the invalid edges from the edge data input thereto from the image recognition unit 25, and outputs the remaining edge data to the fusion processing unit 29. [0124]
• In the image recognition unit 25, the edges are extracted according to the density differences on the image. Thus, the edges of letters and shadows on the road surface, though not targets, are undesirably extracted according to their density differences. [0125]
• The road surface/letter edge determination unit 33 determines, based on the distance information and height information of the extracted density difference, whether the density difference belongs to the road surface or to a target. In the case where it is determined that the density difference is that of the road surface, the corresponding edge data is output to the invalid edge removing unit 34 with the road surface flag attached thereto. [0126]
• Letters written on the road surface change from the road surface color to white or yellow, or from white or yellow to the road surface color, in the vicinity of their edges. On detecting either of these changes, the road surface/letter edge determination unit 33 determines, using the density information in the neighborhood of the extracted edge, that road surface letters are involved. Upon this determination, the road surface/letter edge determination unit 33 outputs the edge with a letter flag attached thereto to the invalid edge removing unit 34. [0127]
• In the case where the road surface flag or the letter flag is attached to the edge data and the distance information indicates a near distance (say, not more than 10 m), the invalid edge removing unit 34 determines whether a near flag is output from the EHF radar 11. Unless the near flag is output, the particular edge is determined to be a density difference or letters on the road surface and is removed, while the remaining edge data are output to the fusion processing unit 29. [0128]
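The removal rule can be condensed into a small filter; the 10 m limit is the "say" figure from the text, and the dictionary record layout for edge data is hypothetical.

```python
NEAR_LIMIT_M = 10.0  # "say, not more than 10 m" in the text

def remove_invalid_edges(edges, near_flag_output):
    """Embodiment 2-4 removal rule: an edge carrying a road surface or
    letter flag whose distance reads as near is discarded when the radar
    outputs no near flag.

    edges: dicts with keys 'distance', 'road_flag', 'letter_flag' -- a
    hypothetical record layout for illustration.
    """
    kept = []
    for e in edges:
        flagged = e["road_flag"] or e["letter_flag"]
        if flagged and e["distance"] <= NEAR_LIMIT_M and not near_flag_output:
            continue  # density difference or letters on the road surface
        kept.append(e)
    return kept
```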
  • The aforementioned process will be explained in detail with reference to the flowchart of FIG. 30. [0129]
• In the image recognition unit 25, as in steps S31 to S35 of embodiment 2-1 described above, an image is input (step S91), edges are extracted (step S92), a peak is extracted (step S93), the pattern matching is carried out (step S94), and the distance is measured (step S95). Using the technique mentioned above, the road surface flag or the letter flag is attached to the relevant edge data (step S96). [0130]
• The invalid edge removing unit 34 determines whether the road surface flag or the letter flag exists (step S97), whether the edge distance information indicates a near distance (step S98), and whether the near flag is output from the EHF radar 11 (step S99). In the case where the road surface flag or the letter flag is attached, the edge distance information indicates a near distance and the near flag is not output, the flagged edge data is removed (step S100), and the remaining edge data is delivered to the fusion processing unit 29. [0131]
• In the fusion processing unit 29, as in steps S41 and S42 of embodiment 2-1 described above, the fusion processing is carried out (step S101) and the result is output (step S102). [0132]
  • The embodiment 2-4 can be modified in the following way. [0133]
• The road surface/letter edge determination unit 33 may output only the road surface flag, from the distance and height of the density difference of the road surface, or, conversely, may output only the letter flag, from the change in the density difference of the road surface. [0134]
• Also, the process can be changed as shown in the flowchart of FIG. 31. Specifically, the invalid edge removing unit 34 determines whether the road surface flag or the letter flag is attached to the edge data, and determines in step S981 whether the distance information indicates a far distance (say, not less than 10 m). In the case where the distance information indicates a far distance, it is determined in step S991 whether the distance data output from the EHF radar 11 is within the allowable error range of the distance information of the edge data. In the case where it is not within the allowable error range, the edge data to which the road surface flag or the letter flag is attached is removed in step S100. [0135]
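The FIG. 31 variant replaces the near-flag test with a distance-agreement test for far edges; in this sketch the allowable error range is an assumed value, since the text does not quantify it.

```python
FAR_LIMIT_M = 10.0      # "say, not less than 10 m"
ALLOWED_ERROR_M = 2.0   # allowable error range; an assumed value

def remove_far_flagged_edges(edges, radar_distance_m):
    """FIG. 31 variant (steps S981, S991, S100): a flagged edge whose
    distance reads as far is kept only when the radar's distance output
    agrees with the edge's distance within the allowable error range.

    edges: dicts with keys 'distance', 'road_flag', 'letter_flag', as in
    the previous sketch.
    """
    kept = []
    for e in edges:
        flagged = e["road_flag"] or e["letter_flag"]
        if (flagged and e["distance"] >= FAR_LIMIT_M
                and abs(e["distance"] - radar_distance_m) > ALLOWED_ERROR_M):
            continue  # no radar support for this far, flagged edge
        kept.append(e)
    return kept
```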
  • According to this embodiment, the erroneous recognition and the erroneous distance measurement in the image recognition system can be prevented before the fusion processing by use of the letter flag and the road surface flag of the image recognition system. [0136]

Claims (22)

1. A target detection system comprising:
a radar;
an image acquisition unit; and
a processing unit for specifying an area of image recognition based on the data output from said radar and processing the image data output from said image acquisition unit only for said specified area.
2. A target detection system comprising:
a radar for scanning a specified area and outputting a signal of power corresponding to an object scanned;
an image acquisition unit for acquiring an image of said specified area; and
a processing unit for specifying an area of image recognition based on the power of the signal output from said radar, extracting the edge data from the image data output from said image acquisition unit only for said specified area, and detecting a target based on said edge data.
3. A target detection system according to claim 2, wherein said processing unit specifies an area having said power not less than a predetermined level as said image recognition area.
4. A target detection system according to claim 2, wherein said processing unit specifies an area having said power between a first predetermined level and a second predetermined level as said image recognition area.
5. A target detection system according to claim 2, wherein, in the case where the level of said power has a plurality of peaks, said processing unit specifies an area constituting a valley between the peaks as said image recognition area.
6. A target detection system according to claim 2, wherein said processing unit specifies, as said image recognition area, an area having said power not less than a predetermined level, an area having said power between a first predetermined level and a second predetermined level, and an area constituting a valley between peaks in the case where the level of said power has a plurality of peaks.
7. A target detection system according to claim 2, wherein said processing unit extracts the peak position of the power of the signal output from said radar, checks the density change, on the left and right sides of said peak position, of the image data output from said image acquisition unit, extracts the position where the density change ceases to be laterally symmetric, and specifies an area having a predetermined width about said extraction position as said image recognition area.
8. A target detection system according to claim 2, wherein said processing unit extracts the peak position of the power of the signal output from said radar, examines the density projection value, on the left and right sides of said peak position, of the image data output from said image acquisition unit, extracts the position where the density projection value ceases to be laterally symmetric, and specifies an area having a predetermined width about said extraction position as said image recognition area.
9. A target detection system comprising:
a radar for scanning a specified area and outputting a signal of power corresponding to an object scanned;
an image acquisition unit for acquiring an image of said specified area; and
a processing unit for extracting the edge data from the image data output from said image acquisition unit, specifying an area of image recognition based on the power of the signal output from said radar, and detecting a target using those of said extracted edge data existing in said specified area.
10. A target detection system according to claim 9, wherein said processing unit specifies an area having said power not less than a predetermined level as said image recognition area.
11. A target detection system according to claim 9, wherein said processing unit specifies an area having said power between a first predetermined level and a second predetermined level as said image recognition area.
12. A target detection system according to claim 9, wherein, in the case where the level of said power has a plurality of peaks, said processing unit specifies an area constituting a valley between the peaks as said image recognition area.
13. A target detection system according to claim 9, wherein said processing unit specifies, as said image recognition area, an area having said power not less than a predetermined level, an area having said power between a first predetermined level and a second predetermined level, and an area constituting a valley between peaks in the case where the level of said power has a plurality of peaks.
14. A target detection system comprising:
a radar for outputting a near flag in the state corresponding to the distance upon determination that a target exists in a near area;
an image acquisition unit for acquiring the image of a specified area;
an image recognition unit for outputting edge data by processing the image data output from said image acquisition unit; and
a processing unit for determining the state of the near flag output from said radar and detecting a target based on the edge data in the distance range corresponding to the state of said near flag among the edge data output from said image recognition unit.
15. A target detection system comprising:
a radar for outputting a near flag in the state corresponding to the distance upon determination that a target exists in a near area;
an image acquisition unit for acquiring the image of a specified area;
an image recognition unit for acquiring edge data by processing the image data output from said image acquisition unit and, among said edge data, removing the edge data outside the distance range corresponding to the state of the near flag output from a processing unit and outputting the remaining edge data; and
a processing unit for determining the state of the near flag output from said radar, outputting the result thereof to said image recognition unit, and detecting a target based on the edge data output from said image recognition unit.
16. A target detection system comprising:
a radar for outputting a near flag in the state corresponding to the distance upon determination that a target exists in a near area;
an image acquisition unit for acquiring the image of a specified area;
an image recognition unit for acquiring edge data by processing the image data output from said image acquisition unit and, among said edge data, performing the pattern matching processing in priority for the edge data within the distance range corresponding to the state of the near flag output from a processing unit, and outputting said edge data; and
a processing unit for determining the state of the near flag output from said radar, outputting the result thereof to said image recognition unit, and detecting a target based on the edge data output from said image recognition unit.
17. A target detection system comprising:
a radar for outputting a near flag in the state corresponding to the distance upon determination that a target exists in a near area;
an image acquisition unit for acquiring the image of a specified area;
an image recognition unit for acquiring edge data by processing the image data output from said image acquisition unit and, among said edge data, outputting by attaching the road surface flag to the edge data identified as an edge of the road surface from the height information and the distance information obtained from said image data; and
a processing unit for determining the distance range corresponding to the state of the near flag output from said radar, invalidating the edge data output from said image recognition unit and having said road surface flag attached thereto, in the case where the distance information of said edge data with said road surface flag attached thereto indicates a near distance and said determined distance range indicates a far distance, and performing said target detection process based on the remaining edge data.
18. A target detection system comprising:
a radar for outputting distance information;
an image acquisition unit for acquiring the image of a specified area;
an image recognition unit for acquiring edge data by processing the image data output from said image acquisition unit and, among said edge data, outputting by attaching the road surface flag to the edge data identified as an edge of the road surface based on the height information and the distance information obtained from said image data; and
a processing unit for determining whether the distance information of the edge data output from said image recognition unit and having said road surface flag attached thereto is within the allowable error range of the distance information acquired from said radar, in the case where said edge data with said road surface flag attached thereto indicates a far distance, and invalidating said edge data and performing the target detection process based on the remaining edge data in the case where said distance information is not within said allowable error range.
19. A target detection system comprising:
a radar for outputting a near flag in the state corresponding to the distance upon determination that a target exists in a near area;
an image acquisition unit for acquiring the image of a specified area;
an image recognition unit for acquiring edge data by processing the image data output from said image acquisition unit and, among said edge data, outputting by attaching a letter flag to the edge data identified as a letter on the road surface based on the density information obtained from said image data; and
a processing unit for determining the distance range corresponding to the state of the near flag output from said radar, invalidating the edge data output from said image recognition unit and having said letter flag attached thereto in the case where the distance information of the edge data with said letter flag attached thereto indicates a near distance and said determined distance range indicates a far distance, and performing said target detection process based on the remaining edge data.
20. A target detection system comprising:
a radar for outputting distance information;
an image acquisition unit for acquiring the image of a specified area;
an image recognition unit for acquiring edge data by processing the image data output from said image acquisition unit and, among said edge data, outputting by attaching a letter flag to the edge data identified as a letter on the road surface based on the density information obtained from said image data; and
a processing unit for determining whether the distance information of the edge data output from said image recognition unit and having said letter flag attached thereto is within the allowable error range of the distance information acquired from said radar, in the case where said distance information of the edge data with said letter flag attached thereto indicates a far distance, and invalidating said edge data and performing the target detection process based on the remaining edge data in the case where said distance information is not within said allowable error range.
21. A target detection system comprising:
a radar for outputting a near flag in the state corresponding to the distance upon determination that a target exists in a near area;
an image acquisition unit for acquiring the image of a specified area;
an image recognition unit for acquiring edge data by processing the image data output from said image acquisition unit and, among said edge data, outputting by attaching a road surface flag to the edge data identified as an edge of the road surface based on the height information and the distance information obtained from said image data, and also by attaching a letter flag to the edge data identified as a letter on the road surface based on the density information; and
a processing unit for determining the distance range corresponding to the state of the near flag output from said radar, and invalidating the edge data output from said image recognition unit and having at least one of said road surface flag and said letter flag attached thereto, in the case where the distance information of said edge data having at least one of said road surface flag and said letter flag attached thereto indicates a near distance and said determined distance range indicates a far distance, and performing said target detection process based on the remaining edge data.
22. A target detection system comprising:
a radar for outputting a near flag in the state corresponding to the distance upon determination that a target exists in a near area;
an image acquisition unit for acquiring the image of a specified area;
an image recognition unit for acquiring edge data by processing the image data output from said image acquisition unit and, among said edge data, outputting by attaching a road surface flag to the edge data identified as an edge of the road surface based on the height information and the distance information obtained from said image data and also by attaching a letter flag to the edge data identified as a letter on the road surface based on the density information; and
a processing unit for determining whether the distance information of the edge data output from said image recognition unit and having at least one of said road surface flag and said letter flag attached thereto is within the allowable error range of the distance data obtained from said radar, in the case where the distance information of said edge data having at least one of said road surface flag and said letter flag attached thereto indicates a far distance, and invalidating said edge data having at least one of said road surface flag and said letter flag attached thereto and performing said target detection process based on the remaining edge data, in the case where said distance information is not within said allowable error range.
US09/834,403 2000-04-14 2001-04-13 Target detection system using radar and image processing Abandoned US20010031068A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/188,160 US7376247B2 (en) 2000-04-14 2005-07-21 Target detection system using radar and image processing

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2000118549A JP4308405B2 (en) 2000-04-14 2000-04-14 Object detection device
JP2000-118549 2000-04-14
JP2000152695A JP4311861B2 (en) 2000-05-18 2000-05-18 Vehicle object detection device

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US11/188,160 Division US7376247B2 (en) 2000-04-14 2005-07-21 Target detection system using radar and image processing

Publications (1)

Publication Number Publication Date
US20010031068A1 true US20010031068A1 (en) 2001-10-18

Family

ID=26590424

Family Applications (2)

Application Number Title Priority Date Filing Date
US09/834,403 Abandoned US20010031068A1 (en) 2000-04-14 2001-04-13 Target detection system using radar and image processing
US11/188,160 Expired - Fee Related US7376247B2 (en) 2000-04-14 2005-07-21 Target detection system using radar and image processing

Family Applications After (1)

Application Number Title Priority Date Filing Date
US11/188,160 Expired - Fee Related US7376247B2 (en) 2000-04-14 2005-07-21 Target detection system using radar and image processing

Country Status (1)

Country Link
US (2) US20010031068A1 (en)

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030204384A1 (en) * 2002-04-24 2003-10-30 Yuri Owechko High-performance sensor fusion architecture
WO2005024460A1 (en) * 2003-09-11 2005-03-17 Toyota Jidosha Kabushiki Kaisha Object detection system and object detection method
US20050110672A1 (en) * 2003-10-10 2005-05-26 L-3 Communications Security And Detection Systems, Inc. Mmw contraband screening system
US20050195384A1 (en) * 2004-03-08 2005-09-08 Fumio Ohtomo Surveying method and surveying instrument
US20050270225A1 (en) * 2004-06-02 2005-12-08 Setsuo Tokoro Obstacle recognition system and obstacle recognition method
US20060091653A1 (en) * 2004-11-04 2006-05-04 Autoliv Asp, Inc. System for sensing impending collision and adjusting deployment of safety device
US20060091654A1 (en) * 2004-11-04 2006-05-04 Autoliv Asp, Inc. Sensor system with radar sensor and vision sensor
US20060155469A1 (en) * 2003-07-11 2006-07-13 Tomoya Kawasaki Crash-safe vehicle control system
GB2424527A (en) * 2003-07-30 2006-09-27 Ford Motor Co Collision warning and countermeasure system for an automobile
US20060242198A1 (en) * 2005-04-22 2006-10-26 Microsoft Corporation Methods, computer-readable media, and data structures for building an authoritative database of digital audio identifier elements and identifying media items
US20070165910A1 (en) * 2006-01-17 2007-07-19 Honda Motor Co., Ltd. Vehicle surroundings monitoring apparatus, method, and program
US7656508B2 (en) 2005-05-19 2010-02-02 Olympus Corporation Distance measuring apparatus, distance measuring method, and computer program product
US20100094508A1 (en) * 2008-10-15 2010-04-15 Michel Kozyreff Sensor system including a confirmation sensor for detecting an impending collision
US20100225522A1 (en) * 2009-03-06 2010-09-09 Demersseman Bernard Guy Sensor system for detecting an impending collision of a vehicle
CN104380039A (en) * 2012-07-31 2015-02-25 哈曼国际工业有限公司 System and method for detecting obstacles using a single camera
US20150217765A1 (en) * 2014-02-05 2015-08-06 Toyota Jidosha Kabushiki Kaisha Collision prevention control apparatus
US20150234044A1 (en) * 2012-09-03 2015-08-20 Toyota Jidosha Kabushiki Kaisha Collision determination device and collision determination method
CN105136036A (en) * 2015-09-24 2015-12-09 中国科学院上海高等研究院 Portable three-dimensional scanning system integrating images and laser
CN105572663A (en) * 2014-09-19 2016-05-11 通用汽车环球科技运作有限责任公司 Detection of a distributed radar target based on an auxiliary sensor
CN105701762A (en) * 2015-12-30 2016-06-22 联想(北京)有限公司 Picture processing method and electronic equipment
US9463744B2 (en) * 2001-07-31 2016-10-11 Magna Electronics Inc. Driver assistance system for a vehicle
CN107067003A (en) * 2017-03-09 2017-08-18 百度在线网络技术(北京)有限公司 Extracting method, device, equipment and the computer-readable storage medium of region of interest border
US20180218228A1 (en) * 2017-01-31 2018-08-02 Denso Corporation Apparatus and method for controlling vehicle
US20190279386A1 (en) * 2016-11-30 2019-09-12 Naoki MOTOHASHI Information processing device, imaging device, apparatus control system, information processing method, and computer program product
WO2020133206A1 (en) * 2018-12-28 2020-07-02 深圳市大疆创新科技有限公司 Radar simulation method and apparatus
CN111653054A (en) * 2020-04-16 2020-09-11 国家核安保技术中心 Physical protection system for nuclear facilities
CN113075638A (en) * 2021-04-30 2021-07-06 深圳安德空间技术有限公司 Multi-source data synchronous acquisition and fusion method and system for underground space exploration
CN114428235A (en) * 2022-01-07 2022-05-03 西安电子科技大学 Space micro-motion target identification method based on decision-level fusion

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8330647B2 (en) * 2006-06-08 2012-12-11 Vista Research, Inc. Sensor suite and signal processing for border surveillance
JP5042558B2 (en) * 2006-08-10 2012-10-03 富士通テン株式会社 Radar equipment
JP4876080B2 (en) * 2008-01-25 2012-02-15 富士重工業株式会社 Environment recognition device
JP4956452B2 (en) * 2008-01-25 2012-06-20 富士重工業株式会社 Vehicle environment recognition device
RU2444758C1 (en) * 2010-06-16 2012-03-10 Открытое акционерное общество "Головное системное конструкторское бюро Концерна ПВО "Алмаз-Антей" имени академика А.А. Расплетина" (ОАО "ГСКБ "Алмаз-Антей") Method for determining number, velocity and range of targets and amplitudes of signals reflected from them as per return signal in digital channel of radar
US9041589B2 (en) * 2012-04-04 2015-05-26 Caterpillar Inc. Systems and methods for determining a radar device coverage region
US9052393B2 (en) 2013-01-18 2015-06-09 Caterpillar Inc. Object recognition system having radar and camera input
US9167214B2 (en) 2013-01-18 2015-10-20 Caterpillar Inc. Image processing system using unified images
US9582886B2 (en) * 2013-07-08 2017-02-28 Honda Motor Co., Ltd. Object recognition device
JP5929870B2 (en) * 2013-10-17 2016-06-08 株式会社デンソー Target detection device
US10254395B2 (en) * 2013-12-04 2019-04-09 Trimble Inc. System and methods for scanning with integrated radar detection and image capture
US9335178B2 (en) * 2014-01-28 2016-05-10 GM Global Technology Operations LLC Method for using street level images to enhance automated driving mode for vehicle
CN104007047B (en) * 2014-06-17 2016-01-27 青岛理工大学 Particle system power chain identification method
EP3164859B1 (en) * 2014-07-03 2022-08-31 GM Global Technology Operations LLC Vehicle radar methods and systems
DE102018210814A1 (en) * 2018-06-30 2020-01-02 Robert Bosch Gmbh Method for the detection of static radar targets with a radar sensor for motor vehicles

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5430450A (en) * 1993-02-10 1995-07-04 Ford Motor Company Method and apparatus for automatically dimming motor vehicle headlights using radar signal
US5706355A (en) * 1991-03-22 1998-01-06 Thomson-Csf Method of analyzing sequences of road images, device for implementing it and its application to detecting obstacles
US6275773B1 (en) * 1993-08-11 2001-08-14 Jerome H. Lemelson GPS vehicle collision avoidance warning and control system and method
US6377191B1 (en) * 1999-05-25 2002-04-23 Fujitsu Limited System for assisting traffic safety of vehicles
US6414712B1 (en) * 1995-12-13 2002-07-02 Daimlerchrylsler, Ag Vehicle navigational system and signal processing method for said navigational system
US6498972B1 (en) * 2002-02-13 2002-12-24 Ford Global Technologies, Inc. Method for operating a pre-crash sensing system in a vehicle having a countermeasure system
US6763125B2 (en) * 1999-09-29 2004-07-13 Fujitsu Ten Limited Image recognition apparatus and image processing apparatus

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7202776B2 (en) * 1997-10-22 2007-04-10 Intelligent Technologies International, Inc. Method and system for detecting objects external to a vehicle
DE59913752D1 (en) * 1998-07-13 2006-09-21 Contraves Ag Method for tracking moving objects based on specific features
US6042050A (en) * 1999-02-16 2000-03-28 The United States Of America As Represented By The Secretary Of The Army Synthetic discriminant function automatic target recognition system augmented by LADAR
JP2003501635A (en) * 1999-05-26 2003-01-14 ローベルト ボッシュ ゲゼルシャフト ミット ベシュレンクテル ハフツング Object detection system
US6259803B1 (en) * 1999-06-07 2001-07-10 The United States Of America As Represented By The Secretary Of The Navy Simplified image correlation method using off-the-shelf signal processors to extract edge information using only spatial data


Cited By (50)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170028916A1 (en) * 2001-07-31 2017-02-02 Magna Electronics Inc. Driver assistance system for a vehicle
US10406980B2 (en) * 2001-07-31 2019-09-10 Magna Electronics Inc. Vehicular lane change system
US20190039516A1 (en) * 2001-07-31 2019-02-07 Magna Electronics Inc. Vehicular lane change system
US9463744B2 (en) * 2001-07-31 2016-10-11 Magna Electronics Inc. Driver assistance system for a vehicle
US10099610B2 (en) * 2001-07-31 2018-10-16 Magna Electronics Inc. Driver assistance system for a vehicle
US20030204384A1 (en) * 2002-04-24 2003-10-30 Yuri Owechko High-performance sensor fusion architecture
US7715591B2 (en) * 2002-04-24 2010-05-11 Hrl Laboratories, Llc High-performance sensor fusion architecture
US20060155469A1 (en) * 2003-07-11 2006-07-13 Tomoya Kawasaki Crash-safe vehicle control system
US7613568B2 (en) 2003-07-11 2009-11-03 Toyota Jidosha Kabushiki Kaisha Crash-safe vehicle control system
GB2424527B (en) * 2003-07-30 2007-07-11 Ford Motor Co A method and system for performing object detection
GB2424527A (en) * 2003-07-30 2006-09-27 Ford Motor Co Collision warning and countermeasure system for an automobile
WO2005024460A1 (en) * 2003-09-11 2005-03-17 Toyota Jidosha Kabushiki Kaisha Object detection system and object detection method
US7417580B2 (en) 2003-09-11 2008-08-26 Toyota Jidosha Kabushiki Kaisha Object detection system and object detection method
US20070080850A1 (en) * 2003-09-11 2007-04-12 Kyoichi Abe Object detection system and object detection method
US20050110672A1 (en) * 2003-10-10 2005-05-26 L-3 Communications Security And Detection Systems, Inc. Mmw contraband screening system
US7889113B2 (en) * 2003-10-10 2011-02-15 L-3 Communications Security and Detection Systems Inc. Mmw contraband screening system
EP1574821A2 (en) 2004-03-08 2005-09-14 Kabushiki Kaisha Topcon Surveying method and surveying instrument
EP1574821A3 (en) * 2004-03-08 2009-11-04 Kabushiki Kaisha Topcon Surveying method and surveying instrument
US7764809B2 (en) 2004-03-08 2010-07-27 Kabushiki Kaisha Topcon Surveying method and surveying instrument
US20050195384A1 (en) * 2004-03-08 2005-09-08 Fumio Ohtomo Surveying method and surveying instrument
US20050270225A1 (en) * 2004-06-02 2005-12-08 Setsuo Tokoro Obstacle recognition system and obstacle recognition method
US7570198B2 (en) * 2004-06-02 2009-08-04 Toyota Jidosha Kabushiki Kaisha Obstacle recognition system and obstacle recognition method
US20060091653A1 (en) * 2004-11-04 2006-05-04 Autoliv Asp, Inc. System for sensing impending collision and adjusting deployment of safety device
US20060091654A1 (en) * 2004-11-04 2006-05-04 Autoliv Asp, Inc. Sensor system with radar sensor and vision sensor
US20060242198A1 (en) * 2005-04-22 2006-10-26 Microsoft Corporation Methods, computer-readable media, and data structures for building an authoritative database of digital audio identifier elements and identifying media items
US7656508B2 (en) 2005-05-19 2010-02-02 Olympus Corporation Distance measuring apparatus, distance measuring method, and computer program product
US8175331B2 (en) * 2006-01-17 2012-05-08 Honda Motor Co., Ltd. Vehicle surroundings monitoring apparatus, method, and program
US20070165910A1 (en) * 2006-01-17 2007-07-19 Honda Motor Co., Ltd. Vehicle surroundings monitoring apparatus, method, and program
US8095276B2 (en) 2008-10-15 2012-01-10 Autoliv Asp, Inc. Sensor system including a confirmation sensor for detecting an impending collision
US20100094508A1 (en) * 2008-10-15 2010-04-15 Michel Kozyreff Sensor system including a confirmation sensor for detecting an impending collision
US20100225522A1 (en) * 2009-03-06 2010-09-09 Demersseman Bernard Guy Sensor system for detecting an impending collision of a vehicle
US9798936B2 (en) 2012-07-31 2017-10-24 Harman International Industries, Incorporated System and method for detecting obstacles using a single camera
CN104380039A (en) * 2012-07-31 2015-02-25 哈曼国际工业有限公司 System and method for detecting obstacles using a single camera
US9405006B2 (en) * 2012-09-03 2016-08-02 Toyota Jidosha Kabushiki Kaisha Collision determination device and collision determination method
US20150234044A1 (en) * 2012-09-03 2015-08-20 Toyota Jidosha Kabushiki Kaisha Collision determination device and collision determination method
US9481365B2 (en) * 2014-02-05 2016-11-01 Toyota Jidosha Kabushiki Kaisha Collision prevention control apparatus
US20150217765A1 (en) * 2014-02-05 2015-08-06 Toyota Jidosha Kabushiki Kaisha Collision prevention control apparatus
US10088561B2 (en) 2014-09-19 2018-10-02 GM Global Technology Operations LLC Detection of a distributed radar target based on an auxiliary sensor
CN105572663A (en) * 2014-09-19 2016-05-11 通用汽车环球科技运作有限责任公司 Detection of a distributed radar target based on an auxiliary sensor
CN105136036A (en) * 2015-09-24 2015-12-09 中国科学院上海高等研究院 Portable three-dimensional scanning system integrating images and laser
CN105701762A (en) * 2015-12-30 2016-06-22 联想(北京)有限公司 Picture processing method and electronic equipment
US20190279386A1 (en) * 2016-11-30 2019-09-12 Naoki MOTOHASHI Information processing device, imaging device, apparatus control system, information processing method, and computer program product
US10762656B2 (en) * 2016-11-30 2020-09-01 Ricoh Company, Ltd. Information processing device, imaging device, apparatus control system, information processing method, and computer program product
US10592755B2 (en) * 2017-01-31 2020-03-17 Denso Corporation Apparatus and method for controlling vehicle
US20180218228A1 (en) * 2017-01-31 2018-08-02 Denso Corporation Apparatus and method for controlling vehicle
CN107067003A (en) * 2017-03-09 2017-08-18 百度在线网络技术(北京)有限公司 Extracting method, device, equipment and the computer-readable storage medium of region of interest border
WO2020133206A1 (en) * 2018-12-28 2020-07-02 深圳市大疆创新科技有限公司 Radar simulation method and apparatus
CN111653054A (en) * 2020-04-16 2020-09-11 国家核安保技术中心 Physical protection system for nuclear facilities
CN113075638A (en) * 2021-04-30 2021-07-06 深圳安德空间技术有限公司 Multi-source data synchronous acquisition and fusion method and system for underground space exploration
CN114428235A (en) * 2022-01-07 2022-05-03 西安电子科技大学 Space micro-motion target identification method based on decision-level fusion

Also Published As

Publication number Publication date
US7376247B2 (en) 2008-05-20
US20050271253A1 (en) 2005-12-08

Similar Documents

Publication Publication Date Title
US7376247B2 (en) Target detection system using radar and image processing
US5617085A (en) Method and apparatus for monitoring the surroundings of a vehicle and for detecting failure of the monitoring apparatus
EP2422320B1 (en) Object detection device
JP3671825B2 (en) Inter-vehicle distance estimation device
US6477260B1 (en) Position measuring apparatus using a pair of electronic cameras
US6670912B2 (en) Method for detecting stationary object located above road
EP0810569B1 (en) Lane detection sensor and navigation system employing the same
JP3630100B2 (en) Lane detection device
US20040178945A1 (en) Object location system for a road vehicle
JP2003296736A (en) Device for detecting obstacle and method thereof
KR101180621B1 (en) Apparatus and method for detecting a vehicle
JPH08329393A (en) Preceding vehicle detector
JP2003099762A (en) Precedent vehicle recognition device and recognition method
JP2005329779A (en) Method and device for recognizing obstacle
JP4308405B2 (en) Object detection device
JPH08156723A (en) Vehicle obstruction detecting device
JP4719996B2 (en) Object detection device
JP2003281700A (en) Cutting-in vehicle detecting device and method
JP3384278B2 (en) Distance measuring device
US20200098126A1 (en) Object detection apparatus
JP2003077099A (en) Preceding vehicle recognition system
JP4311861B2 (en) Vehicle object detection device
JP3868915B2 (en) Forward monitoring apparatus and method
JP2003308599A (en) Traveling route environment detector
JP2002008019A (en) Railway track recognition device and rolling stock using railway track recognition device

Legal Events

Date Code Title Description
AS Assignment

Owner name: FUJITSU TEN LIMITED, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:OHTA, AKIHIRO;OKA, KENJI;REEL/FRAME:011713/0918

Effective date: 20010405

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION