CN108459595A - Mobile electronic device and method in the mobile electronic device - Google Patents
Mobile electronic device and method in the mobile electronic device
- Publication number
- CN108459595A (application CN201710457159.XA)
- Authority
- CN
- China
- Prior art keywords
- electronic equipment
- mobile
- mobile electronic
- task area
- electronic device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0212—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
- G05D1/0217—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with energy consumption, time reduction or distance reduction criteria
Landscapes
- Engineering & Computer Science (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
A first mobile electronic device includes a wireless signal transceiver, an image processor, a locating module and a motion module. The wireless signal transceiver obtains an image, acquired by a camera, of a task area that contains the first mobile electronic device. The image processor identifies characteristic information of the first mobile electronic device in the image of the task area and establishes a coordinate system for the image. The locating module identifies the position and contour of the first mobile electronic device by comparing the identified characteristic information of the first mobile electronic device with the coordinate origin on the coordinate system, and draws a map of the task area. The locating module is also communicatively coupled to a second mobile electronic device: it sends the map to the second mobile electronic device, and the second mobile electronic device receives a task location set on the map by the user. The motion module receives the task location from the second mobile electronic device, plans a path according to the task location and the identified position and contour of the first mobile electronic device, and moves along the path.
Description
Technical field
The present invention relates to the field of electronic devices, and more particularly to the field of intelligent robot systems.
Background technology
A traditional sweeping robot either localizes itself autonomously on a scanned map while it moves, or wanders randomly and changes direction by reacting to collisions, cleaning the floor as it goes. Because its mapping and positioning technology is immature or inaccurate, a traditional sweeping robot cannot fully judge the complicated state of the floor during operation and easily loses track of its position and heading. Moreover, some models have no positioning capability at all and can only change direction through the physics of collision, which may damage household items or the robot itself, cause personal injury, or otherwise disturb the user.
Summary of the invention
The mobile electronic device system described in the embodiments of the present invention, for example a robot system, uses a camera fixedly mounted indoors to perform positioning and map making, determines the task area from the user's feedback, plans a path, and localizes the robot in real time so that the robot travels along the planned path. This solves the problem that the robot cannot judge the floor conditions, its own position, or the best travel route. The fixed camera replaces the robot's on-board tracking sensors, and the robot's repetitive labour replaces human labour, saving the research-and-development and equipment costs of an intelligent robot so that investment can go into the sweeping and dust-collecting mechanism. In addition, because the camera is fixed, the accumulated error caused by arranging a camera inside the robot is also eliminated, so accurate positioning and mapping are achieved.
The robot system described in the embodiments provides accurate positioning and mapping, which improves the working efficiency of the robot while reducing the user's workload.
A first mobile electronic device according to one embodiment includes a wireless signal transceiver, an image processor, a locating module and a motion module, wherein: the wireless signal transceiver is communicatively connected to a camera mounted indoors and is configured to obtain an image, acquired by the camera, of a task area that contains the first mobile electronic device; the image processor is communicatively coupled to the wireless signal transceiver and is configured to identify characteristic information of the first mobile electronic device in the image of the task area; the locating module is communicatively coupled to the image processor and is configured to establish a coordinate system, compare the identified characteristic information of the first mobile electronic device with the coordinate origin, identify the position and contour of the first mobile electronic device, and draw a map of the task area; the locating module is also communicatively coupled to a second mobile electronic device and is configured to send the map to the second mobile electronic device, the second mobile electronic device receiving a task location set on the map by the user; and the motion module is communicatively coupled to the locating module and the second mobile electronic device and is configured to receive the task location from the second mobile electronic device, plan a path according to the task location and the identified position and contour of the first mobile electronic device, and move along the path.
Optionally or alternatively, the image of the task area further includes at least one first obstacle. The image processor is further configured to identify characteristic information of the at least one first obstacle in the image of the task area; the locating module is further configured to compare the characteristic information of the at least one first obstacle with the coordinate origin and identify the position and contour of the at least one obstacle; and the motion module is further configured to plan a path according to the task location, the located position of the first mobile electronic device and the identified position and contour of the at least one first obstacle, and to move along the path.
Optionally or alternatively, while the motion module is moving, the camera acquires the image of the task area containing the first mobile electronic device in real time, and the image processor and the locating module dynamically determine the characteristic information, position and contour of the first mobile electronic device.
Optionally or alternatively, the camera is mounted vertically on the ceiling of the task area.
Optionally or alternatively, the first mobile electronic device further includes a charging dock, and the charging dock includes the image processor and the locating module.
Optionally or alternatively, the first mobile electronic device may further include a sensor. The sensor sends information about a second obstacle around the first mobile electronic device to the motion module, and the motion module is further configured to adjust the heading of the first mobile electronic device to avoid the second obstacle.
Optionally or alternatively, the sensor of the first mobile electronic device includes an ultrasonic sensor and/or a laser sensor.
Another aspect of the invention is a method in a first mobile electronic device, the first mobile electronic device including a wireless signal transceiver, an image processor, a locating module and a motion module. The method includes: obtaining, by the wireless signal transceiver communicatively connected to at least one camera mounted indoors, an image acquired by the at least one camera of a task area that contains the first mobile electronic device; identifying, by the image processor communicatively coupled to the wireless signal transceiver, characteristic information of the first mobile electronic device in the image of the task area, establishing a coordinate system for the image, and assigning a corresponding coordinate value to each point in the task area; identifying, by the locating module communicatively coupled to the image processor, the position and contour of the first mobile electronic device by comparing, on the established coordinate system, the identified characteristic information of the first mobile electronic device with the coordinate origin, and drawing a map of the task area; sending, by the locating module communicatively coupled to a second mobile electronic device, the map to the second mobile electronic device, the second mobile electronic device receiving a task location set on the map by the user; and receiving, by the motion module communicatively coupled to the locating module and the second mobile electronic device, the task location from the second mobile electronic device, planning a path according to the task location and the identified position and contour of the first mobile electronic device, and moving along the path.
Description of the drawings
A more complete understanding of the present invention may be obtained by referring to the detailed description given in connection with the accompanying drawings, in which like reference numerals refer to like parts.
Fig. 1 shows a schematic diagram of a system containing a first mobile electronic device and a second mobile electronic device according to an embodiment of the invention.
Fig. 2 shows a block diagram of the processor in the first mobile electronic device according to an embodiment of the invention.
Figs. 3A-3D show schematic diagrams of a first mobile electronic device according to an embodiment of the invention.
Fig. 4 shows a flow chart of a method in a first mobile electronic device according to an embodiment of the invention.
Detailed description of the embodiments
Embodiment one
Fig. 1 shows a schematic diagram of a system containing a first mobile electronic device 100 and a second mobile electronic device 120 according to an embodiment of the invention.
Referring to Fig. 1, the first mobile electronic device 100 includes, but is not limited to, a sweeping robot, an industrial automation robot, a service robot, a rescue and relief robot, an underwater robot, a space robot, an autonomous vehicle, and the like. When the first mobile electronic device 100 is a field-operation device such as an autonomous vehicle, the camera may, for example, be carried by an unmanned aerial vehicle flying overhead. In one embodiment, the first mobile electronic device 100 itself does not include a camera.
The second mobile electronic device 120 includes, but is not limited to, a mobile phone, a tablet computer, a laptop, a remote controller, and the like. The mobile electronic device optionally includes an operation interface. In an optional embodiment, the mobile electronic device is a mobile phone, and the operation interface is a mobile phone APP.
The signal transmission form between the first mobile electronic device 100 and the second mobile electronic device 120 includes, but is not limited to, Bluetooth, WIFI, ZigBee, infrared, ultrasonic, UWB, and the like. In this embodiment the description takes WIFI as the example signal transmission form.
The task area is the place where the first mobile electronic device 100 performs its task. For example, when the first mobile electronic device 100 is a sweeping robot, the task area is the region the sweeping robot needs to clean. As another example, when the first mobile electronic device 100 is a rescue and relief robot, the task area is the place where the rescue robot needs to carry out rescue work.
As shown in Fig. 1, in one embodiment the first mobile electronic device 100 includes a wireless signal transceiver 102, an image processor 1040, a locating module 1042 and a motion module 106. In addition, the first mobile electronic device 100 further includes an encoder and an Inertial Measurement Unit (IMU); these two parts are carried by the first mobile electronic device 100, that is, by the robot 100 itself. The first mobile electronic device 100 may also include a memory 110 and a data processor 1046, which may be located in the robot 100 or inside the charging dock, optionally inside the charging dock. The memory 110 and the data processor 1046 respectively store and process image information, data information and the like. The memory 110 is communicatively connected to the processor 104. The data processor 1046 is located in the processor 104. As shown in Fig. 2, the data processor 1046 is connected to the image processor 1040 and the path planning module 1044.
The wireless signal transceiver 102 is communicatively connected to a camera 140 mounted indoors. The wireless signal transceiver 102 is configured to obtain the image, acquired by the camera 140, of the task area containing the first mobile electronic device 100. For example, a camera 140 fixed on the ceiling takes a top-down shot of the indoor plane. The wireless communication module in the camera 140 is connected by WIFI to the wireless communication module 102 in the first mobile electronic device 100.
The image processor 1040 is communicatively coupled to the wireless signal transceiver 102 and is configured to identify the characteristic information of the first mobile electronic device 100 in the image of the task area. In addition, the image processor 1040 also establishes a two-dimensional coordinate system for the plane image and assigns a corresponding coordinate value to each indoor point. The locating module 1042 is communicatively coupled to the image processor 1040 and is configured to compare, on the established coordinate system, the identified characteristic information of the first mobile electronic device 100 with the coordinate origin, identify the position and contour of the first mobile electronic device 100, and draw the map of the task area.
For example, the indoor plane image shot from the ceiling by the camera 140 is sent to the image processor 1040 in the first mobile electronic device 100 for processing and analysis. The function of the image processor 1040 is to identify the features of the first mobile electronic device 100 in the image. Optionally, the image processor 1040 may identify these features using the Scale Invariant Feature Transform (SIFT) algorithm or the Speeded Up Robust Features (SURF) algorithm. With the SIFT algorithm, a reference image needs to be stored in the memory 110. The image processor 1040 first identifies the key points of the object in the reference image stored in the memory 110 and extracts their SIFT features, then compares the SIFT feature of each key point in the memory 110 with the SIFT features of the newly acquired image, and matches features based on K-Nearest Neighbor (KNN) to identify the object in the new image. The SURF algorithm is based on approximate 2D Haar wavelet responses and uses integral images for image convolution; it uses a Hessian matrix-based measure for the detector and a distribution-based descriptor. Those skilled in the art will understand that the image processor 1040 and the locating module 1042 may be included in the processor 104. The processor 104 is further described below with reference to Fig. 2.
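Purely as an illustration of the SIFT-plus-KNN matching step described above, the following Python sketch (assuming the OpenCV library, which the patent does not name) extracts SIFT key points from a stored reference image and from a newly captured ceiling frame, and keeps matches that pass a nearest-neighbour ratio test; the file paths and the ratio value are assumptions.

```python
# Illustrative sketch only: SIFT keypoint matching with a KNN ratio test,
# as described for image processor 1040. OpenCV and the parameter values
# are assumptions; the patent does not specify a library.
import cv2

def match_robot_features(reference_img_path, frame_img_path, ratio=0.75):
    reference = cv2.imread(reference_img_path, cv2.IMREAD_GRAYSCALE)  # stored in memory 110
    frame = cv2.imread(frame_img_path, cv2.IMREAD_GRAYSCALE)          # new ceiling image

    sift = cv2.SIFT_create()
    ref_kp, ref_desc = sift.detectAndCompute(reference, None)
    frm_kp, frm_desc = sift.detectAndCompute(frame, None)

    # K-nearest-neighbor matching (k=2) followed by a ratio test.
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    knn_matches = matcher.knnMatch(ref_desc, frm_desc, k=2)
    good = [pair[0] for pair in knn_matches
            if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance]

    # The matched keypoint coordinates in the frame approximate the robot's
    # image-plane position; locating module 1042 would convert these to map
    # coordinates relative to the coordinate origin.
    return [frm_kp[m.trainIdx].pt for m in good]
```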
The locating module 1042 is communicatively coupled to the image processor 1040 and is configured to identify the position and contour of the first mobile electronic device 100 by establishing a coordinate system and comparing the identified characteristic information of the first mobile electronic device 100 with the coordinate origin, and to draw the map of the task area. The locating module 1042 is also communicatively coupled to the second mobile electronic device 120 and is further configured to send the map to the second mobile electronic device 120, the second mobile electronic device 120 receiving the task location set on the map by the user. For example, the locating module 1042 establishes a plane rectangular coordinate system on the processed plane image, sets the location of the charging dock as the coordinate origin, gives each point in the image a corresponding coordinate value (X, Y), and marks the position and contour of the first mobile electronic device 100, thereby producing the indoor map. The user's second mobile electronic device 120, for example a mobile phone 120, is connected by WIFI to the wireless signal transceiver 102 of the first mobile electronic device 100. The APP of the mobile phone 120 reads the prepared indoor map, and the user sets the task location on the map with the mobile phone APP, that is, one or more positions that need to be cleaned (the corresponding X, Y coordinate information); the locating module 1042 of the first mobile electronic device 100 records this set position information. Meanwhile, the first mobile electronic device 100 uses the ceiling camera 140 and its internal image processing module 1040 to find the image of the first mobile electronic device 100, for example the sweeping robot 100, and the locating module 1042 locates the coordinate information (X, Y) of the robot 100 on the map in real time.
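The coordinate assignment performed by the locating module 1042 can be pictured with the following minimal Python sketch; the charging-dock pixel position and the metres-per-pixel scale are example assumptions, not values from the patent.

```python
# Illustrative sketch: assign plane coordinates to image points, with the
# charging dock as the origin (0, 0). The scale would in practice be derived
# from the camera mounting height; the numbers below are assumptions.

def make_pixel_to_map(dock_px, dock_py, metres_per_pixel):
    """Return a function mapping image pixels to map coordinates (X, Y)."""
    def to_map(px, py):
        x = (px - dock_px) * metres_per_pixel
        y = (py - dock_py) * metres_per_pixel
        return (x, y)
    return to_map

# Example: dock seen at pixel (320, 240), ceiling camera scaled to 5 mm/pixel.
pixel_to_map = make_pixel_to_map(320, 240, 0.005)
robot_xy = pixel_to_map(480, 300)   # robot centroid found by the image processor
print(robot_xy)                     # (0.8, 0.3) metres from the dock
```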
Then, the motion module 106, communicatively coupled to the locating module 1042 and the second mobile electronic device 120, is configured to receive the task location from the second mobile electronic device 120, plan a path according to the task location and the identified position and contour of the first mobile electronic device, and move along the path.
In addition, the first mobile electronic device 100, for example the robot 100, further includes a robot encoder and an inertial measurement module (IMU) that assist the camera 140 in obtaining the position and attitude of the robot. For example, when the robot is occluded and not within the camera's line of sight, the encoder and the IMU can still provide the robot's position and attitude. For example, the encoder can be used as an odometer: by recording the rotation of the robot's wheels, the trajectory travelled by the robot 100 can be calculated.
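A minimal sketch of the odometer role of the encoder is given below: dead reckoning for a differential-drive robot from wheel-encoder ticks. The wheel radius, ticks per revolution and wheel separation are assumed example values, not figures from the patent.

```python
# Illustrative sketch: dead reckoning from wheel-encoder ticks.
import math

TICKS_PER_REV = 360      # encoder resolution (assumed)
WHEEL_RADIUS = 0.03      # metres (assumed)
WHEEL_BASE = 0.20        # metres between the two drive wheels (assumed)

def update_pose(x, y, theta, left_ticks, right_ticks):
    """Integrate one encoder reading into the robot pose (x, y, theta)."""
    d_left = 2 * math.pi * WHEEL_RADIUS * left_ticks / TICKS_PER_REV
    d_right = 2 * math.pi * WHEEL_RADIUS * right_ticks / TICKS_PER_REV
    d_centre = (d_left + d_right) / 2.0
    d_theta = (d_right - d_left) / WHEEL_BASE

    # Advance along the mean heading of the step, then update the heading.
    x += d_centre * math.cos(theta + d_theta / 2.0)
    y += d_centre * math.sin(theta + d_theta / 2.0)
    theta += d_theta
    return x, y, theta
```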
Alternatively, or in addition, the image of the task area further includes at least one first obstacle. The image processor 1040 is further configured to identify the characteristic information of the at least one first obstacle in the image of the task area. As discussed above, the image processor 1040 may use the SIFT or SURF algorithm for image recognition. The locating module 1042 is further configured to compare the characteristic information of the at least one first obstacle with the coordinate origin and identify the position and contour of the at least one obstacle. The obstacle avoidance module 1048 and the motion module 106 are further configured to plan a path according to the task location selected by the user with the second mobile electronic device 120, for example the mobile phone 120, the located position of the first mobile electronic device 100 and the identified position and contour of the at least one first obstacle, and to move along the path. For example, once the cleaning location information and the robot location information are both confirmed, the path planning module 1044 in the first mobile electronic device 100 plans the optimal path that avoids the obstacles and sends the path information over the wireless network to the motion module 106 of the robot 100, and the robot 100 then sets out along this path to the cleaning point and begins its task.
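The patent does not name a particular planning algorithm; the sketch below uses grid-based A* search as one possible way the path planning module 1044 could compute an obstacle-avoiding path from the robot cell to the task cell on the occupancy map. The grid representation is an assumption for illustration.

```python
# Illustrative sketch: A* search on an occupancy grid (0 = free, 1 = obstacle).
import heapq

def plan_path(grid, start, goal):
    """Return a list of (row, col) cells from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])

    def h(cell):  # Manhattan-distance heuristic
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    open_set = [(h(start), 0, start, [start])]
    visited = set()
    while open_set:
        _, cost, cell, path = heapq.heappop(open_set)
        if cell == goal:
            return path
        if cell in visited:
            continue
        visited.add(cell)
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                heapq.heappush(open_set, (cost + 1 + h((nr, nc)),
                                          cost + 1, (nr, nc), path + [(nr, nc)]))
    return None  # no obstacle-free path exists
```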
Alternatively, or in addition, while the motion module 106 is moving, the camera 140 acquires the image of the task area containing the first mobile electronic device in real time, and the image processor 1040 and the locating module 1042 dynamically determine the characteristic information, position and contour of the first mobile electronic device 100.
Alternatively, or in addition, the first mobile electronic device 100 may further include a sensor 108. The sensor 108 sends information about a second obstacle around the first mobile electronic device 100 to the motion module 106, and the motion module 106 is further configured to adjust the heading of the first mobile electronic device 100 to avoid the second obstacle. It will be appreciated that, because of the different mounting heights, the ceiling-mounted camera 140 captures a top-down view while the sensor on the first mobile electronic device 100 obtains a head-on view; the obstacles captured by the ceiling-mounted camera 140 and the obstacles seen by the robot's sensor may therefore differ, for example because of occlusion. The motion module 106 combines the first obstacle information captured by the ceiling camera 140 with the second obstacle information perceived by the sensor to plan an obstacle-avoiding path. Optionally or alternatively, the sensor includes an ultrasonic sensor and/or a laser sensor. It will be appreciated that the first obstacle and the second obstacle may be the same or different, so the camera 140 and the sensor 108 can assist each other. For example, where there is occlusion, in the occluded part the robot 100 needs to rely on its own laser sensor, ultrasonic sensor and the like for obstacle avoidance.
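One way to picture this combination is the sketch below, which merges the obstacle positions reported by the ceiling camera with those reported by the onboard sensors into a single occupancy set before replanning; the set representation and the 0.1 m grid resolution are assumptions, not details from the patent.

```python
# Illustrative sketch: motion module 106 merging camera-detected obstacles
# (first obstacles) with sensor-detected obstacles (second obstacles).

GRID_RESOLUTION = 0.1  # metres per occupancy cell (assumed)

def to_cell(xy):
    """Quantize a metric (x, y) position onto the occupancy grid."""
    return (round(xy[0] / GRID_RESOLUTION), round(xy[1] / GRID_RESOLUTION))

def merge_obstacles(camera_obstacles, sensor_obstacles):
    """Union of obstacle positions from both viewpoints, as occupancy cells.

    Objects occluded from the ceiling camera are still present in
    sensor_obstacles, so the merged set covers both views.
    """
    occupied = {to_cell(p) for p in camera_obstacles}
    occupied |= {to_cell(p) for p in sensor_obstacles}
    return occupied
```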
For example, while the first mobile electronic device 100, for example the robot 100, is moving, the image processor 1040 and the locating module 1042 in the first mobile electronic device 100 use the ceiling camera 140 to track the robot 100 dynamically, ensuring that the location information of the robot 100 remains accurate; meanwhile, the laser sensor and ultrasonic sensor carried by the robot 100 detect the static and dynamic environment around the robot, assisting it in avoiding static and dynamic obstacles and in adjusting the optimal path.
Optionally or alternatively, the camera 140 is mounted vertically on the ceiling of the task area. The camera 140 may, for example, be a wide-angle camera, for example a fish-eye camera. When the image processor 1040 uses a photograph taken by the camera 140, it therefore only needs to take the height information into account to scale the captured image and obtain the actual distance between targets in the image (for example, between the first mobile electronic device 100, such as the robot, and the first obstacle). The height information can be acquired by an altimeter or by a ranging device such as a laser range finder or an infrared range finder; it can be measured by the first mobile electronic device 100 or stored in the first mobile electronic device 100 in advance.
Optionally or alternatively, the camera 140 can also reuse a monitoring camera. For example, the monitoring camera 140 may be mounted on a wall, and the first mobile electronic device 100 then obtains the distance between targets using trigonometric functions and the like. To use an existing monitoring camera, the height above the ground and the lens angle of the monitoring camera must be known in order to calculate the scale of the image and obtain accurate dimension values.
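The scale and distance calculations mentioned above can be illustrated with the following sketch, which assumes a simple pinhole model; the focal length, ceiling height and tilt angle are example values, not taken from the patent.

```python
# Illustrative sketch: converting pixel measurements into ground distances.
import math

def metres_per_pixel(ceiling_height_m, focal_length_px):
    """Ground-plane scale for a camera looking straight down from the ceiling."""
    return ceiling_height_m / focal_length_px

def ground_distance_wall_camera(mount_height_m, angle_below_horizontal_deg):
    """Horizontal distance to where a wall-mounted camera's ray meets the floor."""
    return mount_height_m / math.tan(math.radians(angle_below_horizontal_deg))

# Example: 2.6 m ceiling, 800 px focal length -> about 3.25 mm of floor per pixel.
print(metres_per_pixel(2.6, 800))
# Example: camera 2.0 m up the wall, ray 30 degrees below horizontal -> about 3.46 m.
print(ground_distance_wall_camera(2.0, 30))
```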
Optionally or alternatively, if the lens viewing angle of a single camera is limited, multiple cameras mounted on the ceiling can be used at the same time to cover a wider region, or even the entire indoor space.
Optionally or alternatively, the first mobile electronic device 100 further includes a charging dock 160, and the charging dock 160 includes the processor 104. For example, the charging dock includes the image processor 1040 and the locating module 1042.
Embodiment two
Figs. 3A-3D show schematic diagrams of a first mobile electronic device 100 according to an embodiment of the invention. The following description takes the first mobile electronic device 100 being a robot 300 as an example.
Referring to Figs. 3A-3D, the robot 300 is mainly composed of five parts: a main body 310, a wireless transceiver 320, a microprocessor 330, a sensor 340 and a charging dock 350. Through the wireless transceiver 320, the microprocessor 330 and the charging dock 350, the user can realize the fixed-point function within a certain region, and the sensor 340 assists the main body 310 in completing the specified functional operations.
Referring to Figs. 3A and 3B, the main body 310 is the carrier of the microprocessor 330 and the sensor 340, and the microprocessor 330 can assist the control unit of the main body 310 in executing functions such as movement.
Referring to Figs. 3C and 3D, the wireless transceiver 320 can send communication signals such as Bluetooth, WIFI, ZigBee, infrared and ultrasonic via a mobile electronic device such as a mobile phone or a computer, can also exchange location information with the microprocessor 330 and the charging dock 350, and sends the location information to the microprocessor 330 in the form of instructions.
Referring to Fig. 3B, the microprocessor 330 is located on the main body 310; it can receive the instructions sent by the wireless transceiver 320, process the received instructions programmatically, and transmit the processed result to the control unit of the main body 310 to execute the corresponding task. The microprocessor 330 can also transmit communication signals such as Bluetooth, WIFI, ZigBee, infrared and ultrasonic to the charging dock 350 to realize the automatic return-to-dock charging function.
Referring to Fig. 3A, the sensor 340 can detect the surrounding environment in real time and realize intelligent obstacle avoidance and the detection and transmission of communication signals.
Referring to Fig. 3D, the charging dock 350 can receive the signals transmitted by the wireless transceiver 320 and can also receive the communication signals sent by the microprocessor 330, realizing the location-information exchange function and guiding the microprocessor 330 to control the main body 310 so as to realize the return-to-dock charging function.
Embodiment three
Fig. 4 shows a flow chart of a method in a first mobile electronic device according to an embodiment of the invention.
Fig. 4 shows a method 400 in a first mobile electronic device, where the first mobile electronic device 100 includes a wireless signal transceiver 102, an image processor 1040, a locating module 1042 and a motion module 106. The method 400 includes: in block 410, obtaining, by the wireless signal transceiver 102 communicatively connected to the camera 140 mounted indoors, the image acquired by the camera 140 of the task area containing the first mobile electronic device 100; in block 420, identifying, by the image processor 1040 communicatively coupled to the wireless signal transceiver 102, the characteristic information of the first mobile electronic device 100 in the image of the task area; in block 430, identifying, by the locating module 1042 communicatively coupled to the image processor 1040, the position and contour of the first mobile electronic device 100 by establishing a coordinate system and comparing the identified characteristic information of the first mobile electronic device 100 with the coordinate origin, and drawing the map of the task area; in block 440, sending, by the locating module 1042 communicatively coupled to the second mobile electronic device, the map to the second mobile electronic device, the second mobile electronic device receiving the task location set on the map by the user; and in block 450, receiving, by the motion module 106 communicatively coupled to the locating module 1042 and the second mobile electronic device, the task location from the second mobile electronic device, planning a path according to the task location and the identified position and contour of the first mobile electronic device 100, and moving along the path.
Optionally or alternatively, the image of the task area further includes at least one first obstacle, and the method 400 further includes (not shown in the figure): identifying, by the image processor 1040, the characteristic information of the at least one first obstacle in the image of the task area; comparing, by the locating module 1042, the characteristic information of the at least one first obstacle with the coordinate origin, and identifying the position and contour of the at least one obstacle; and planning a path, by the motion module 106, according to the task location, the located position of the first mobile electronic device 100 and the identified position and contour of the at least one first obstacle, and moving along the path.
Optionally or alternatively, the method 400 further includes (not shown in the figure): moving by the motion module 106; acquiring, by the camera, the image of the task area containing the first mobile electronic device 100 in real time; and dynamically determining, by the image processor 1040 and the locating module 1042, the characteristic information, position and contour of the first mobile electronic device 100.
Optionally or alternatively, the method 400 further includes (not shown in the figure): assisting the camera in obtaining the position and attitude of the first mobile electronic device by an encoder and an inertial measurement module communicatively connected to the processor.
Optionally or alternatively, the camera is mounted vertically on the ceiling of the task area.
Optionally or alternatively, the first mobile electronic device 100 further includes a charging dock 160, and the charging dock 160 includes the image processor 1040 and the locating module 1042.
Optionally or alternatively, the first mobile electronic device 100 may further include a sensor 108, and the method 400 further includes (not shown in the figure): sending, by the sensor 108, information about a second obstacle around the first mobile electronic device 100 to the motion module 106; and adjusting, by the motion module 106, the heading of the first mobile electronic device 100 to avoid the second obstacle.
Optionally or alternatively, the sensor includes an ultrasonic sensor and/or a laser sensor.
In the foregoing description, the invention has been described with reference to specific exemplary embodiments; it will be appreciated, however, that various modifications and variations may be made without departing from the scope of the invention described herein. The description and drawings are to be regarded in an illustrative rather than a restrictive sense, and all such modifications are intended to be included within the scope of the invention. Accordingly, the scope of the invention should be determined by the general embodiments described herein and their legal equivalents, rather than only by the specific embodiments described above. For example, the steps described in any method or process embodiment may be performed in any order and are not limited to the explicit order presented in a particular embodiment. Furthermore, the components and/or elements described in any device embodiment may be assembled in various arrangements or otherwise operatively configured to produce substantially the same result as the invention, and are therefore not limited to the specific configuration described in the specific embodiments.
Benefits, other advantages and solutions to problems have been described above with regard to specific embodiments; however, no benefit, advantage or solution to a problem, and no element that may cause any particular benefit, advantage or solution to occur or become more pronounced, is to be construed as a critical, required or essential feature or component.
As used herein, the terms "include", "comprise" or any variation thereof are intended to reference a non-exclusive inclusion, such that a process, method, article, composition or apparatus that comprises a list of elements includes not only those elements recited but may also include other elements not expressly listed or inherent to such process, method, article, composition or apparatus. In addition to those not specifically recited, other combinations and/or modifications of the above-described structures, arrangements, applications, proportions, elements, materials or components used in the practice of the invention may be varied or otherwise particularly adapted to specific environments, manufacturing specifications, design parameters or other operating requirements without departing from its general principles.
Although the invention has been described herein with reference to certain preferred embodiments, those skilled in the art will readily appreciate that other applications may be substituted for those set forth herein without departing from the spirit and scope of the invention. Accordingly, the invention is to be limited only by the following claims.
Claims (16)
1. A first mobile electronic device, comprising a wireless signal transceiver, an image processor, a locating module and a motion module, wherein:
the wireless signal transceiver is communicatively connected to at least one camera mounted indoors and is configured to obtain an image, acquired by the at least one camera, of a task area containing the first mobile electronic device;
the image processor is communicatively coupled to the wireless signal transceiver and is configured to identify characteristic information of the first mobile electronic device in the image of the task area, establish a coordinate system for the image, and assign a corresponding coordinate value to each point in the task area;
the locating module is communicatively coupled to the image processor and is configured to compare, on the established coordinate system, the identified characteristic information of the first mobile electronic device with the coordinate origin, identify the position and contour of the first mobile electronic device, and draw a map of the task area;
the locating module is also communicatively coupled to a second mobile electronic device and is configured to send the map to the second mobile electronic device, the second mobile electronic device receiving a task location set on the map by a user; and
the motion module is communicatively coupled to the locating module and the second mobile electronic device and is configured to receive the task location from the second mobile electronic device, plan a path according to the task location and the identified position and contour of the first mobile electronic device, and move along the path.
2. The first mobile electronic device according to claim 1, wherein the image of the task area further includes at least one first obstacle,
the image processor is further configured to identify characteristic information of the at least one first obstacle in the image of the task area;
the locating module is further configured to compare the characteristic information of the at least one first obstacle with the coordinate origin and identify the position and contour of the at least one obstacle; and
the motion module is further configured to plan a path according to the task location, the located position of the first mobile electronic device and the identified position and contour of the at least one first obstacle, and to move along the path.
3. The first mobile electronic device according to claim 1, wherein, while the motion module is moving, the at least one camera acquires the image of the task area containing the first mobile electronic device in real time; and
the image processor and the locating module are further configured to dynamically determine the characteristic information, position and contour of the first mobile electronic device.
4. The first mobile electronic device according to claim 1, further comprising an encoder and an inertial measurement module communicatively connected to the processor, configured to assist the at least one camera in obtaining the position and attitude of the first mobile electronic device.
5. The first mobile electronic device according to claim 1, wherein the at least one camera is mounted vertically on the ceiling of the task area.
6. The first mobile electronic device according to any one of claims 1-5, further comprising a charging dock, wherein the charging dock includes the image processor and the locating module.
7. The first mobile electronic device according to any one of claims 1-5, further comprising a sensor, wherein the sensor sends information about a second obstacle around the first mobile electronic device to the motion module, and the motion module is further configured to adjust the heading of the first mobile electronic device to avoid the second obstacle.
8. The first mobile electronic device according to claim 7, wherein the sensor includes an ultrasonic sensor and/or a laser sensor.
9. A method in a first mobile electronic device, the first mobile electronic device comprising a wireless signal transceiver, an image processor, a locating module and a motion module, the method comprising:
obtaining, by the wireless signal transceiver communicatively connected to at least one camera mounted indoors, an image acquired by the at least one camera of a task area containing the first mobile electronic device;
identifying, by the image processor communicatively coupled to the wireless signal transceiver, characteristic information of the first mobile electronic device in the image of the task area, establishing a coordinate system for the image, and assigning a corresponding coordinate value to each point in the task area;
identifying, by the locating module communicatively coupled to the image processor, the position and contour of the first mobile electronic device by comparing, on the established coordinate system, the identified characteristic information of the first mobile electronic device with the coordinate origin, and drawing a map of the task area;
sending, by the locating module communicatively coupled to a second mobile electronic device, the map to the second mobile electronic device, the second mobile electronic device receiving a task location set on the map by a user; and
receiving, by the motion module communicatively coupled to the locating module and the second mobile electronic device, the task location from the second mobile electronic device, planning a path according to the task location and the identified position and contour of the first mobile electronic device, and moving along the path.
10. The method according to claim 9, wherein the image of the task area further includes at least one first obstacle, and the method further comprises:
identifying, by the image processor, characteristic information of the at least one first obstacle in the image of the task area;
comparing, by the locating module, the characteristic information of the at least one first obstacle with the coordinate origin, and identifying the position and contour of the at least one obstacle; and
planning a path, by the motion module, according to the task location, the located position of the first mobile electronic device and the identified position and contour of the at least one first obstacle, and moving along the path.
11. The method according to claim 9, further comprising:
moving by the motion module;
acquiring, by the at least one camera, the image of the task area containing the first mobile electronic device in real time; and
dynamically determining, by the image processor and the locating module, the characteristic information, position and contour of the first mobile electronic device.
12. The method according to claim 11, further comprising:
assisting the at least one camera in obtaining the position and attitude of the first mobile electronic device by an encoder and an inertial measurement module communicatively connected to the processor.
13. The method according to any one of claims 9-12, wherein the at least one camera is mounted vertically on the ceiling of the task area.
14. The method according to any one of claims 9-12, wherein the first mobile electronic device further comprises a charging dock, and the charging dock includes the image processor and the locating module.
15. The method according to any one of claims 9-12, wherein the first mobile electronic device further comprises a sensor, and the method further comprises:
sending, by the sensor, information about a second obstacle around the first mobile electronic device to the motion module; and
adjusting, by the motion module, the heading of the first mobile electronic device to avoid the second obstacle.
16. The method according to claim 15, wherein the sensor includes an ultrasonic sensor and/or a laser sensor.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710457159.XA CN108459595A (en) | 2017-06-16 | 2017-06-16 | A kind of method in mobile electronic device and the mobile electronic device |
PCT/CN2018/090176 WO2018228258A1 (en) | 2017-06-16 | 2018-06-07 | Mobile electronic device and method therein |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710457159.XA CN108459595A (en) | 2017-06-16 | 2017-06-16 | A kind of method in mobile electronic device and the mobile electronic device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108459595A true CN108459595A (en) | 2018-08-28 |
Family
ID=63220337
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710457159.XA Pending CN108459595A (en) | 2017-06-16 | 2017-06-16 | A kind of method in mobile electronic device and the mobile electronic device |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN108459595A (en) |
WO (1) | WO2018228258A1 (en) |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20090025822A (en) * | 2007-09-07 | 2009-03-11 | (주)로봇에버 | Method and position data generator for self-localization of robot by using landmark and wireless lan, and robot by using same |
CN104460663A (en) * | 2013-09-23 | 2015-03-25 | 科沃斯机器人科技(苏州)有限公司 | Method for controlling cleaning robot through smart phone |
CN103926927A (en) * | 2014-05-05 | 2014-07-16 | 重庆大学 | Binocular vision positioning and three-dimensional mapping method for indoor mobile robot |
CN105204505A (en) * | 2015-09-22 | 2015-12-30 | 深圳先进技术研究院 | Positioning video acquiring and drawing system and method based on sweeping robot |
US20170108874A1 (en) * | 2015-10-19 | 2017-04-20 | Aseco Investment Corp. | Vision-based system for navigating a robot through an indoor space |
CN107490377A (en) * | 2017-07-17 | 2017-12-19 | 五邑大学 | Indoor map-free navigation system and navigation method |
- 2017-06-16: CN application CN201710457159.XA filed, published as CN108459595A, status Pending
- 2018-06-07: PCT application PCT/CN2018/090176 filed, published as WO2018228258A1, status Application Filing
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102866706A (en) * | 2012-09-13 | 2013-01-09 | 深圳市银星智能科技股份有限公司 | Cleaning robot adopting smart phone navigation and navigation cleaning method thereof |
CN205068153U (en) * | 2015-08-07 | 2016-03-02 | 浙江海洋学院 | Distributing type visual positioning system based on walking robot |
CN105467992A (en) * | 2015-11-20 | 2016-04-06 | 纳恩博(北京)科技有限公司 | Method and apparatus for determining path of mobile electronic equipment |
CN106595631A (en) * | 2016-10-25 | 2017-04-26 | 纳恩博(北京)科技有限公司 | Method for avoiding obstacles and electronic equipment |
CN206833252U (en) * | 2017-06-16 | 2018-01-02 | 炬大科技有限公司 | A kind of mobile electronic device |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110268225A (en) * | 2019-05-09 | 2019-09-20 | 珊口(深圳)智能科技有限公司 | The method of positioning device, server-side and mobile robot on map |
WO2020223975A1 (en) * | 2019-05-09 | 2020-11-12 | 珊口(深圳)智能科技有限公司 | Method of locating device on map, server, and mobile robot |
CN110268225B (en) * | 2019-05-09 | 2022-05-10 | 深圳阿科伯特机器人有限公司 | Method for cooperative operation among multiple devices, server and electronic device |
CN110162049A (en) * | 2019-05-22 | 2019-08-23 | 重庆大学 | Method for searching, device and storage medium |
CN112087649A (en) * | 2020-08-05 | 2020-12-15 | 华为技术有限公司 | Equipment searching method and electronic equipment |
US11627437B2 (en) | 2020-08-05 | 2023-04-11 | Huawei Technologies Co., Ltd. | Device searching method and electronic device |
US11889386B2 (en) | 2020-08-05 | 2024-01-30 | Huawei Technologies Co., Ltd. | Device searching method and electronic device |
CN112797980A (en) * | 2020-12-02 | 2021-05-14 | 新石器慧义知行智驰(北京)科技有限公司 | Indoor unmanned vehicle guiding method and device and electronic equipment |
Also Published As
Publication number | Publication date |
---|---|
WO2018228258A1 (en) | 2018-12-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN207488823U (en) | A kind of mobile electronic device | |
US11314262B2 (en) | Autonomous platform guidance systems with task planning and obstacle avoidance | |
US10102429B2 (en) | Systems and methods for capturing images and annotating the captured images with information | |
CN108459596A (en) | A kind of method in mobile electronic device and the mobile electronic device | |
KR102670610B1 (en) | Robot for airport and method thereof | |
WO2019138836A1 (en) | Information processing device, information processing system, information processing method, and program | |
CN112740274B (en) | System and method for VSLAM scale estimation on robotic devices using optical flow sensors | |
CN207115193U (en) | A kind of mobile electronic device for being used to handle the task of mission area | |
CN108459597B (en) | Mobile electronic device and method for processing tasks in task area | |
US9183638B2 (en) | Image based position determination | |
KR100901311B1 (en) | Autonomous mobile platform | |
CN108459595A (en) | A kind of method in mobile electronic device and the mobile electronic device | |
CN108888187A (en) | A kind of sweeping robot based on depth camera | |
KR20160146379A (en) | Moving robot and controlling method thereof | |
KR20180070932A (en) | A movable object and a method for controlling the same | |
Singh et al. | Comparative analysis of range sensors for the robust autonomous navigation–a review | |
CN207067803U (en) | A kind of mobile electronic device for being used to handle the task of mission area | |
CN108544494B (en) | Positioning device, method and robot based on inertia and visual characteristics | |
WO2013145025A1 (en) | Stereo camera system and mobile object | |
CN206833252U (en) | A kind of mobile electronic device | |
Schneider et al. | On the accuracy of dense fisheye stereo | |
KR100581086B1 (en) | Method and apparatus for mobile robot localization using led of rfid tag | |
KR102048999B1 (en) | Autonomous driving devise and method | |
WO2019202878A1 (en) | Recording medium, information processing apparatus, and information processing method | |
JP7354528B2 (en) | Autonomous mobile device, method and program for detecting dirt on lenses of autonomous mobile device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
RJ01 | Rejection of invention patent application after publication | Application publication date: 20180828