JP2962799B2 - Roadside detection device for mobile vehicles - Google Patents
Roadside detection device for mobile vehicles
- Publication number
- JP2962799B2 JP2257218A JP25721890A
- Authority
- JP
- Japan
- Prior art keywords
- luminance
- variance
- information
- image
- input image
- Prior art date
- Legal status
- Expired - Lifetime
Landscapes
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Image Processing (AREA)
Description
DETAILED DESCRIPTION OF THE INVENTION
(Field of Industrial Application) The present invention relates to a road edge detection device for a mobile vehicle that recognizes the vehicle's road environment by image processing.
(Prior Art) Conventionally, white lines drawn at both edges of a road are extracted from image information input by a video camera or the like by performing correction processing that removes unnecessary components contained in the image, and the road edge is thereby recognized.
(Problems to Be Solved by the Invention) In the above conventional example, however, when shadows, glare, or the like are present on the road surface, a large amount of processing is required to correct for them and the correction takes a long time, so the road edge cannot be recognized at high speed.
The present invention has been made in view of this point, and its object is to propose an effective correction method for the process of recognizing the road environment, thereby enabling high-speed recognition of the road edge.
(Means for Solving the Problems) The present invention has been made to solve the above problems, and comprises the following configuration as one means of doing so. That is, a road edge detection device for a mobile vehicle provided with image input means for recognizing the outside world, comprising: luminance extraction means for extracting luminance information of an input image; variance extraction means for extracting variance information of the input image; and detection means for detecting a white line on the road based on the luminance information and the variance information.
Preferably, the luminance extraction means extracts the luminance information based on a luminance distribution obtained by comparing the luminance of the input image with a predetermined threshold.
Preferably, the variance extraction means extracts the variance information based on a variance coefficient obtained by comparing the variance of the input image with a predetermined threshold.
Preferably, the luminance extraction means calculates, as the luminance information, a luminance distribution by comparing the luminance of the input image with a predetermined threshold; the variance extraction means calculates, as the variance information, a variance coefficient by comparing the variance of the input image with a predetermined threshold; and the detection means compares the luminance distribution with the variance coefficient and detects, as a white line on the road, a region where the luminance distribution is large and the variance coefficient is small.
(Operation) With the above configuration, white lines on the road are recognized at high speed from the luminance information and the variance information.
(Embodiment) A preferred embodiment of the present invention will now be described in detail with reference to the accompanying drawings.
FIG. 1 is a block diagram showing the configuration of a road edge detection device for a mobile vehicle (hereinafter, the device) according to an embodiment of the present invention.
The road environment ahead of the vehicle is captured by a video camera 1, such as a CCD camera, shown in FIG. 1, and is input as image information to the image input unit 2. The image input unit 2 outputs the input image information to the variance calculation unit 3 and the luminance calculation unit 4 simultaneously.
The variance calculation unit 3 calculates the variance coefficient of the road surface from the input image information, and the luminance calculation unit 4 calculates the luminance of the road surface from the same image information. The calculations in the variance calculation unit 3 and the luminance calculation unit 4 are executed simultaneously, and their results are sent to the white line detection unit 5 as variance information and luminance information, respectively. The white line detection unit 5 extracts white line regions on the road surface from the input variance information and luminance information by the method described later.
The main control unit 6 controls the entire device and, based on the white line regions detected by the white line detection unit 5, recognizes the road structure needed for the vehicle to move.
White line recognition in this embodiment will now be described in detail.
FIG. 2 shows an image of the scene ahead of the vehicle captured by the video camera 1. In the figure, the broken line F is the horizontal line passing through the vanishing point V at infinity. If the tilt, focal length, and imaging-surface size of the video camera 1 are known, the inclination of the road surface in the region near the vehicle can be ignored, so the position of the broken line F (its coordinate on the screen) can be obtained by numerical calculation.
The road region targeted for white line extraction lies below the broken line F (let its coordinate be y = F) on the screen, so that region is the search target for the luminance and variance calculations. However, because the changes in luminance and color near the vanishing point V are complicated and ambiguous, that neighborhood is excluded from the search region for the above calculations. Accordingly, if the width of the excluded region in the y-coordinate direction is α, the search region for the calculations is the region below y = F − α.
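As an illustrative sketch only (not part of the patent), the row of the horizon line F and the search band below y = F − α could be derived under a pinhole-camera assumption; the function and parameter names below are hypothetical.

```python
import math

def horizon_row(image_height_px, focal_length_px, tilt_down_rad):
    """Approximate row of the horizon line F, with y measured upward from
    the bottom of the image as in the description above.

    Assumes a pinhole camera pitched down by tilt_down_rad over a locally
    flat road: the horizon projects focal_length * tan(tilt) pixels above
    the image centre. Model and names are assumptions, not from the patent.
    """
    centre_row = image_height_px / 2.0
    return centre_row + focal_length_px * math.tan(tilt_down_rad)

def search_rows(F, alpha):
    """Rows to scan for white lines: from y = 0 (nearest the vehicle) up to
    y = F - alpha; the band of width alpha just under the horizon is skipped
    because luminance and colour near the vanishing point V are ambiguous."""
    return range(0, max(int(F - alpha), 0))
```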
Next, the methods of calculating the luminance information and the variance information in this embodiment will be described.
The luminance calculation unit 4 scans a predetermined region of the input image information and, in a built-in comparator (not shown), compares the luminance of the image in that region with a preset threshold. It then calculates and outputs the resulting luminance distribution (histogram).
For example, scanning is performed in the x-axis direction at the position y = a in the image shown in FIG. 2, and the above comparison is carried out to detect the high-luminance parts of the image on the scanning line. If the luminance distribution shown in FIG. 3(a) is obtained (with I denoting the strength of the distribution), peaks such as P1 and P2 in the figure appear where the luminance is conspicuous compared with the other image regions on the same scanning line. FIG. 3(c) shows the luminance distribution obtained by performing the same processing at y = b in FIG. 2.
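As a minimal sketch (one possible realization assumed for illustration, not the patent's circuitry), the thresholded luminance distribution I(x) along one scan line such as y = a could be formed like this:

```python
import numpy as np

def luminance_distribution(row_pixels, threshold):
    """Compare each pixel of one horizontal scan line with a preset threshold.

    Returns an array I(x) that is 1 where the luminance exceeds the threshold
    and 0 elsewhere; contiguous runs of 1s correspond to peaks such as P1 and
    P2 in FIG. 3(a). The threshold value and representation are assumptions.
    """
    row = np.asarray(row_pixels, dtype=float)
    return (row > threshold).astype(np.uint8)
```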
Next, the method of calculating the variance information will be described.
In general, the texture of the road surface is uniform; in particular, the variance of the image over an asphalt surface is considered to be roughly two to three times that over white lines, vehicles, guardrails, and the like. In other words, white lines and similar objects on the road return a uniform reflection rather than random noise, so the image over them has little variance.
Accordingly, like the luminance calculation unit 4, the variance calculation unit 3 scans a predetermined region of the input image information and calculates the variance coefficient of the image in that region. For example, the variance coefficient is calculated along the x-axis direction at the positions y = a and y = b in the image shown in FIG. 2. The result for y = a is shown in FIG. 3(b), and the result for y = b in FIG. 3(d).
As noted above, the variance coefficient of the road surface often differs clearly from that of white lines and the like, so, as shown in FIGS. 3(b) and 3(d), valleys P5 to P8 appear in the variance coefficient (denoted V in the figures) corresponding to the state of the road surface.
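A sliding local variance is one way to realize the variance coefficient V(x) along a scan line; the window size below is an assumed parameter, not something the patent specifies:

```python
import numpy as np

def variance_profile(row_pixels, window=15):
    """Local variance V(x) over a sliding window centred on each x position.

    Low values are expected over white lines (valleys such as P5 and P6),
    higher values over the asphalt surface. Window size is illustrative.
    """
    row = np.asarray(row_pixels, dtype=float)
    pad = window // 2
    padded = np.pad(row, pad, mode="edge")
    out = np.empty_like(row)
    for x in range(row.size):
        out[x] = padded[x:x + window].var()
    return out
```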
As for the variance coefficient of the road surface, the difference between white line and road is pronounced in the part of the image near the vehicle, but in the distant part of the image it is difficult to distinguish the white line from the road by variance. The luminance distribution and variance coefficient described above are therefore calculated in order starting from the region near the vehicle, that is, from the lower part of the image shown in FIG. 2, and for the distant part of the image the processing is terminated at y = F − α so as to exclude the neighborhood of the vanishing point V as described above. The white line detection unit 5 then compares, for each scan, the luminance distribution obtained by the luminance calculation unit 4 with the variance coefficient obtained by the variance calculation unit 3, and takes as the white line at the road edge the regions that satisfy the condition that the luminance distribution is large and the variance coefficient is small.
Describing this concretely with reference to FIGS. 3(a) and 3(b): the positions on the x-axis of the peaks P1 and P2 of the luminance distribution in FIG. 3(a) coincide with the positions on the x-axis of the valleys P5 and P6 of the variance coefficient in FIG. 3(b), which means that white lines exist in those regions. The same holds for FIGS. 3(c) and 3(d).
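Combining the two sketches above, the matching performed by the white line detection unit 5 — a large luminance distribution and a small variance coefficient at the same x positions — could be expressed as follows; the thresholds are assumed values for illustration:

```python
import numpy as np

def detect_white_line_columns(row_pixels, lum_threshold, var_threshold, window=15):
    """Return the x positions on one scan line judged to belong to a white line:
    where the thresholded luminance is high AND the local variance is low.
    Reuses luminance_distribution() and variance_profile() from the sketches
    above; all thresholds and the window size are illustrative assumptions.
    """
    bright = luminance_distribution(row_pixels, lum_threshold).astype(bool)
    low_var = variance_profile(row_pixels, window) < var_threshold
    return np.flatnonzero(bright & low_var)
```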
As described above, according to this embodiment, luminance information is extracted from the image of the road surface and, at the same time, the variance of the road surface is obtained from that image; by comparing the feature points of the two, the amount of correction processing needed to extract the white lines can be reduced, so the positions of the white lines on the road can be recognized efficiently and at high speed.
The present invention is not limited to the above embodiment; for example, to raise the accuracy of white line recognition, correlation between images may be taken, or the straightness of the white lines may be verified.
(Effects of the Invention) As described above, according to the present invention, the road edge can be recognized at high speed without being affected by shadows or glare on the road surface.
FIG. 1 is a block diagram showing the configuration of a road edge detection device for a mobile vehicle according to an embodiment of the present invention; FIG. 2 shows an image of the scene ahead of the vehicle captured by the video camera 1; FIGS. 3(a) and 3(c) show luminance distributions extracted from the image; and FIGS. 3(b) and 3(d) show the variance coefficients of the image.
In the figures: 1, video camera; 2, image input unit; 3, variance calculation unit; 4, luminance calculation unit; 5, white line detection unit; 6, main control unit.
Continuation of front page
(56) References cited: JP-A-1-232401 (JP, A); JP-A-2-162405 (JP, A)
(58) Fields searched (Int. Cl.6, DB name): G06T 1/00; G05D 1/02
Claims (4)
1. A road edge detection device for a mobile vehicle provided with image input means for recognizing the outside world, comprising: luminance extraction means for extracting luminance information of an input image; variance extraction means for extracting variance information of the input image; and detection means for detecting a white line on the road based on the luminance information and the variance information.
2. The road edge detection device for a mobile vehicle according to claim 1, wherein the luminance extraction means extracts the luminance information based on a luminance distribution obtained by comparing the luminance of the input image with a predetermined threshold.
3. The road edge detection device for a mobile vehicle according to claim 1, wherein the variance extraction means extracts the variance information based on a variance coefficient obtained by comparing the variance of the input image with a predetermined threshold.
4. The road edge detection device for a mobile vehicle according to claim 1, wherein the luminance extraction means calculates, as the luminance information, a luminance distribution by comparing the luminance of the input image with a predetermined threshold; the variance extraction means calculates, as the variance information, a variance coefficient by comparing the variance of the input image with a predetermined threshold; and the detection means compares the luminance distribution with the variance coefficient and detects, as a white line on the road, a region where the luminance distribution is large and the variance coefficient is small.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2257218A JP2962799B2 (en) | 1990-09-28 | 1990-09-28 | Roadside detection device for mobile vehicles |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2257218A JP2962799B2 (en) | 1990-09-28 | 1990-09-28 | Roadside detection device for mobile vehicles |
Publications (2)
Publication Number | Publication Date |
---|---|
JPH04137014A (en) | 1992-05-12 |
JP2962799B2 true JP2962799B2 (en) | 1999-10-12 |
Family
ID=17303309
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
JP2257218A Expired - Lifetime JP2962799B2 (en) | 1990-09-28 | 1990-09-28 | Roadside detection device for mobile vehicles |
Country Status (1)
Country | Link |
---|---|
JP (1) | JP2962799B2 (en) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3373773B2 (en) | 1998-01-27 | 2003-02-04 | 株式会社デンソー | Lane mark recognition device, vehicle travel control device, and recording medium |
EP1504276B1 (en) | 2002-05-03 | 2012-08-08 | Donnelly Corporation | Object detection system for vehicle |
US7526103B2 (en) | 2004-04-15 | 2009-04-28 | Donnelly Corporation | Imaging system for vehicle |
JP4723209B2 (en) * | 2004-06-10 | 2011-07-13 | 川崎重工業株式会社 | Temperature measuring method and apparatus for carrying out the method |
JP4769594B2 (en) * | 2006-02-20 | 2011-09-07 | トヨタ自動車株式会社 | Road section line detection apparatus and method, and program |
WO2008024639A2 (en) | 2006-08-11 | 2008-02-28 | Donnelly Corporation | Automatic headlamp control system |
JP6126849B2 (en) * | 2013-01-25 | 2017-05-10 | 株式会社メガチップス | Lane identification device and lane identification method |
JP6354963B2 (en) * | 2016-09-09 | 2018-07-11 | 本田技研工業株式会社 | Object recognition apparatus, object recognition method, and object recognition program |
- 1990-09-28: Application JP2257218A filed in Japan; granted as patent JP2962799B2; status: Expired - Lifetime
Also Published As
Publication number | Publication date |
---|---|
JPH04137014A (en) | 1992-05-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP3083918B2 (en) | Image processing device | |
US7729516B2 (en) | Ranging device utilizing image processing | |
JP3759429B2 (en) | Obstacle detection apparatus and method | |
US8005266B2 (en) | Vehicle surroundings monitoring apparatus | |
JP4157620B2 (en) | Moving object detection apparatus and method | |
JP2004117078A (en) | Obstacle detection device and method | |
JP3656056B2 (en) | Interrupting vehicle detection device and method | |
JP2002228423A (en) | Tire detecting method and device | |
JP2962799B2 (en) | Roadside detection device for mobile vehicles | |
JPH11139225A (en) | Tunnel detector device and vehicle control device using it | |
JP3550874B2 (en) | Monitoring device | |
JPH06341821A (en) | Travel lane detector | |
JPH11195127A (en) | Method for recognizing white line and device therefor | |
JP2829934B2 (en) | Mobile vehicle environment recognition device | |
JP3868915B2 (en) | Forward monitoring apparatus and method | |
JP2003076987A (en) | Preceding vehicle recognizing device | |
JP3319401B2 (en) | Roadway recognition device | |
JP3157958B2 (en) | Leading vehicle recognition method | |
JPH07114689A (en) | Method for recognizing vehicle registered number | |
JP3915621B2 (en) | Lane mark detector | |
JP4070450B2 (en) | Forward vehicle recognition device and recognition method | |
JP3638028B2 (en) | Vehicle number recognition device | |
JP3104645B2 (en) | Road white line detection method and road white line detection device | |
JPH05313736A (en) | Obstacle recognizing device for moving vehicle | |
JP3230509B2 (en) | Moving image processing device |