EP3484678A1 - Apparatus and methods for providing a reconfigurable robotic platform - Google Patents
Apparatus and methods for providing a reconfigurable robotic platform

Info
- Publication number
- EP3484678A1 (application EP17828284.4A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- service
- robot
- service module
- removable
- robotic platform
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
- B25J19/023—Optical sensing devices including video camera means
- B25J11/008—Manipulators for service tasks
- B25J11/0085—Cleaning
- B25J5/007—Manipulators mounted on wheels or on carriages mounted on wheels
- B25J9/08—Programme-controlled manipulators characterised by modular constructions
- G05D1/0044—Control of position, course, altitude or attitude of land, water, air or space vehicles associated with a remote control arrangement by providing the operator with a computer generated representation of the environment of the vehicle, e.g. virtual reality, maps
- G05D1/0274—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
- G05D1/0282—Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using a RF signal generated in a local control room
- A47L2201/00—Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
- A47L2201/06—Control of the cleaning action for autonomous devices; Automatic detection of the surface condition before, during or after cleaning
- E01H1/05—Brushing apparatus, e.g. with auxiliary instruments for mechanically loosening dirt, with driven brushes
- E01H1/0827—Dislodging by suction; Mechanical dislodging-cleaning apparatus with independent or dependent exhaust, e.g. dislodging-sweeping machines with independent suction nozzles; Mechanical loosening devices working under vacuum
- E01H1/0854—Apparatus in which the mechanically dislodged dirt is partially sucked-off, e.g. dislodging-sweeping apparatus with dirt collector in brush housing or dirt container
- E01H1/101—Hydraulic loosening or dislodging, combined or not with mechanical loosening or dislodging, e.g. road washing machines with brushes or wipers
- E01H1/103—Hydraulic loosening or dislodging, combined or not with mechanical loosening or dislodging, in which the soiled loosening or washing liquid is removed, e.g. by suction
Definitions
- the present application generally relates to a robotic platform.
- the present application relates to apparatus and methods for providing a reconfigurable robotic platform with interchangeable service modules adapted to engage in both autonomous and interactive maintenance and surveillance of generally planar environments.
- the present disclosure describes a method and system for a reconfigurable robotic platform utilizing a plurality of interchangeable service modules and adapted to engage in both autonomous and interactive maintenance and monitoring of a service area, the robotic platform configured to perform a wide variety of tasks utilizing the plurality of interchangeable service modules, and navigating through the service area utilizing a plurality of sensors and guided through a stored service plan for the service area.
- a robot may comprise a propulsion mechanism, a positioning mechanism, and a removable service module adapted to perform a predefined service task.
- the positioning mechanism may comprise a LIDAR system, stereoscopic imaging system, RF location facility, and the like.
- the removable service module may comprise a power source.
- the removable service module may comprise at least one sensor.
- the at least one sensor may be an imaging component adapted to sense a condition of a generally planar surface proximate the robot.
- the at least one sensor may comprise a camera.
- the at least one sensor may comprise a LIDAR system.
- the predefined service task may comprise the servicing of a generally planar surface proximate the robot.
- the positioning mechanism may comprise a stereoscopic camera system.
- the predefined service task may comprise a single service task forming a part of a service plan comprising a plurality of service tasks where the plurality of service tasks are directed to a plurality of non-contiguous generally planar surfaces, where at least two of the plurality of non-contiguous generally planar surfaces may be separated by elevation.
- a method may comprise engaging a propulsion mechanism of a robot comprising a removable service module to position the robot at a predetermined location, and operating the removable service module to service a generally planar surface proximate the robot.
- Positioning the robot may comprise sensing an environment around the robot, and determining a position of the robot by comparing the sensed environment to a model of an environment. Sensing the environment may comprise utilizing a LIDAR system. Sensing the environment may comprise utilizing a stereoscopic image.
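The model-comparison localization described above can be sketched as a toy grid-matching step: score each candidate offset of a locally sensed occupancy patch against a stored map of the environment and keep the best fit. The grid representation, function names, and sample map below are illustrative assumptions, not the patent's method, which may instead operate on LIDAR returns or stereoscopic imagery.

```python
# Hedged sketch of localization by model comparison: score candidate positions
# on a stored occupancy grid against a locally sensed patch and keep the best
# match. The grid/patch representation is an assumption for illustration.

def score(model, patch, r0, c0):
    """Count matching cells when the patch is overlaid at (r0, c0)."""
    return sum(model[r0 + i][c0 + j] == patch[i][j]
               for i in range(len(patch)) for j in range(len(patch[0])))

def localize(model, patch):
    """Return the (row, col) offset where the sensed patch best fits."""
    rows, cols = len(model), len(model[0])
    pr, pc = len(patch), len(patch[0])
    candidates = [(r, c) for r in range(rows - pr + 1)
                  for c in range(cols - pc + 1)]
    return max(candidates, key=lambda rc: score(model, patch, *rc))

# A tiny stored map with a distinctive corner of obstacles at offset (1, 2).
model = [[0, 0, 0, 0],
         [0, 0, 1, 1],
         [0, 0, 1, 0]]
patch = [[1, 1],
         [1, 0]]
pose = localize(model, patch)   # best match at offset (1, 2)
```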
- the method may further comprise selecting a removable service module for insertion into the robot. Selecting the removable service module may comprise selecting the removable service module based, at least in part, on a service plan.
- the service plan may comprise a plurality of areas each with an associated defined removable service module for servicing each of the plurality of areas.
- the method may further comprise changing the removable service module in response to a change in a type of the generally planar surface proximate the robot.
- Changing the removable service module may comprise sliding the removable service module into the robot along at least two rails attached to the robot.
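The plan-driven module selection and surface-triggered module change described in these claims can be sketched as a lookup from surface type to service module. The mapping, module names, and function names are hypothetical illustrations, not values from the patent.

```python
# Illustrative sketch: a service plan maps each surface type to the removable
# service module used to service it; a change in sensed surface type triggers
# a module change. All names and entries here are assumptions.

SERVICE_PLAN = {
    "carpet": "vacuum-module",
    "marble": "polishing-module",
    "wood": "waxing-module",
}

def module_for_surface(surface_type: str) -> str:
    """Select the module the plan associates with a surface type."""
    return SERVICE_PLAN.get(surface_type, "default-cleaning-module")

def maybe_change_module(current_module: str, sensed_surface: str):
    """Return (required_module, changed) after checking the sensed surface."""
    required = module_for_surface(sensed_surface)
    return required, required != current_module
```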
- Figure 1 illustrates operational functions and operational environment of a robotic platform according to exemplary and non-limiting embodiments
- Figure 2 illustrates perspective renderings of a robotic platform according to exemplary and non-limiting embodiments
- Figure 3 illustrates a perspective rendering of a robotic platform with removable service module according to an exemplary and non-limiting embodiment
- Figure 4 illustrates a functional block diagram of the components of a robotic platform and service module according to an exemplary and non-limiting embodiment
- Figure 5 illustrates a functional block diagram of the components of the robotic platform relating to a process of robot localization in an exemplary and non-limiting embodiment
- Figure 6 illustrates a multi-sensor pose estimation process according to an exemplary and non-limiting embodiment
- Figure 7 illustrates a functional block diagram of the components of the robotic platform relating to a process of sensing, obstacle avoidance, and path planning in an exemplary and non-limiting embodiment
- Figure 8 illustrates a service module comprised of a cleaning system in accordance with an exemplary and non-limiting embodiment
- Figures 9 - 33 illustrate graphical user interface display content in accordance with exemplary and non-limiting embodiments.
- Fig. 1 depicts a robotic platform 100 configured to perform a wide variety of tasks, such as in the servicing of multiple surfaces utilizing multiple service modules.
- the robotic platform 100 may be adapted to service a plurality of other environments, such as servicing outdoor traffic surfaces (e.g., cleaning roads and walkways), outdoor groomed landscapes (e.g., mowing lawns), industrial facilities (e.g., warehouses, truck loading areas, manufacturing facilities), and the like.
- the robotic platform 100 may operate in any area through which it can physically operate (e.g., small rooms, hallways, walkways, and the like).
- the size and function of the robotic platform 100 may be scaled to accommodate various application environments.
- a robotic platform 100 configured for cleaning large commercial spaces, such as an airport terminal, may be large, with service modules utilizing correspondingly high-capacity cleaning reservoirs (e.g., cleaning fluid) and resources (e.g., cleaning tool head).
- the robotic platform 100 may also be configured for residential applications, where the physical dimensions of the robotic platform 100 are sized appropriately for cleaning within the confines of a personal residence, utilizing correspondingly sized service modules, reservoirs, and cleaning head.
- the robotic platform 100 may be configured to be larger (e.g., wider, to maximize coverage).
- Fig. 1 depicts operational functions and operational environment of the robotic platform 100 according to exemplary and non-limiting embodiments, and is not meant to be exhaustive, but rather illustrative of particular operational features of the robotic platform.
- the robotic platform 100 includes a main body 214 that accommodates one of a plurality of service modules (102A-D), along with processing 106 capability as informed through a variety of navigational and resource sensors 104. Further detail of the robotic platform 100 is provided herein, but a key feature of the robotic platform is its ability to accommodate different service modules 102A-D, each of which may provide a different functional capability. For example, in an indoor floor cleaning application one module may be for vacuuming, another for rug washing, and still another for floor waxing.
- each service module may be optimized for its function rather than having some compromised subset of functions constrained within a single fixed functionality unit (e.g., a 'one-does-all' robot), or requiring separate individual units for each service capability (e.g., one robot for cleaning, another robot for vacuuming, and still another robot for waxing).
- Modularity enables the robotic platform 100 to optimally provide a wide variety of service functions with a single robotic platform.
- the robotic platform 100 through a main body 214 portion, provides common functions and processes for executing service tasks for the plurality of service modules 102A-D.
- the robotic platform 100 provides processing 106 capabilities (e.g., electronics, processing, software, and the like) to execute operational functionality.
- processing 106 may be provided in a distributed and/or shared-processing architecture between the main body 214 of the robotic platform 100 and the service modules 102A-D.
- the main processing capabilities for navigation and service plan execution may be predominantly located in the main body 214 of the robotic platform 100, but some software, such as for motor drivers and the like, may be unique to the module (e.g., located in the memory of the main body 214 of the robotic platform 100 or in the memory of the service module 102A-D processing).
- there may be multiple levels of processing 106, such as a high-level processing module responsible for overall management of the robotic platform 100, an operational-level processing module that takes navigation sensor input and collects and sends information up to the high-level processor for managing control, and finally a service-module-level processing module, where processing may be shared across the multiple levels (e.g., through shared processing between the main body processing and the service module processing).
- processing capabilities in the main body 214 may be configured to be customizable for a plurality of service module level processing modules.
- each service module 102A-D may have a unique identifier, and when the service module 102A-D is mated with the main body 214 the processing functionality in the main body may identify a module type, a tool type, a software version, and the like, by the unique identifier, and determine an action associated with this particular service module.
- a service module 102A may be a new version of a mopping service module, with updated software and/or hardware components.
- the processing functionality in the main body 214 may then have the capability to accommodate the updated mopping service module through customizable functions and messaging.
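The identification-on-mating behavior described above can be sketched as a registry lookup keyed by the module's unique identifier, from which the main body resolves module type, tool type, and software version and chooses an accommodation action. The registry contents, identifiers, and the naive string version comparison below are all illustrative assumptions.

```python
# Hypothetical sketch: on mating, the main body reads a service module's unique
# identifier and looks up its type, tool, and software version to decide an
# action. All names and registry entries are illustrative, not from the patent.

from dataclasses import dataclass

@dataclass
class ServiceModuleInfo:
    module_type: str
    tool_type: str
    software_version: str

# Registry keyed by the module's unique identifier (sample entries).
MODULE_REGISTRY = {
    "MOP-0002": ServiceModuleInfo("mopping", "flat-mop", "2.1"),
    "VAC-0001": ServiceModuleInfo("vacuum", "brush-roll", "1.4"),
}

def on_module_mated(unique_id: str) -> str:
    """Identify the mated module and choose an accommodation action."""
    info = MODULE_REGISTRY.get(unique_id)
    if info is None:
        return "reject: unknown module"
    # Naive string version comparison, for illustration only: a newer
    # software version triggers a customization/update step.
    if info.software_version >= "2.0":
        return f"load updated {info.module_type} profile v{info.software_version}"
    return f"load standard {info.module_type} profile"
```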
- software for one or more of the processing levels may be updateable, such as through wireless automatic updates.
- the software for top-level processing 106 may be updateable, such as to
- Processing 106 functionality interfaces with sensors 104, such as utilized for navigating (e.g., through imaging, laser ranging, sonar, external RF locating nodes 150A-C), sensing of surface characteristics (e.g., image recognition, surface reflectivity measurements, contact sensors), sensing internal operational parameters (e.g., reservoir levels, fluid cleanliness quality, performance characteristics), and the like.
- Sensors 104 may be located in the main body 214 of the robotic platform 100 and/or in the service modules 102A-D, such as dependent upon their use and function. For example, navigation sensors may be located in the main body 214 of the robotic platform 100 because they provide a resource that the robotic platform 100 will require independent of which service module 102A-D is in use.
- some sensing capabilities may be service module specific, and so located within the service module.
- Service module 102A may, for example, require a sensing capability that service module 102B does not.
- service module 102A may be a rug cleaning service module with a cleaning fluid reservoir, where the cleaning fluid reservoir is monitored by sensors for quality (e.g., sensing cleanness quality of the fluid in the reservoir), level (e.g., sensing how much fluid remains), and the like.
- Service module 102B may be a vacuum service module with no cleaning fluid reservoir but with a material collection container that is monitored for fullness.
- the capability for distributed processing 106 and sensors 104 maximizes the flexibility of the robotic platform 100 in the accommodation of a plurality of service modules 102A-D.
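The module-specific sensing above can be sketched as each service module exposing only the sensors its function requires, with the main body consuming whichever report a module provides. The class names, thresholds, and report keys are assumptions for illustration; the patent does not specify an API.

```python
# Illustrative sketch of module-specific sensing: a rug cleaning module reports
# fluid level and quality, a vacuum module reports bin fullness, and a single
# main-body check works against either report. Names/thresholds are assumed.

class RugCleaningModule:
    def __init__(self, fluid_level: float, fluid_quality: float):
        self.fluid_level = fluid_level      # fraction of reservoir remaining
        self.fluid_quality = fluid_quality  # 1.0 = clean, 0.0 = fully soiled

    def sensor_report(self) -> dict:
        return {"fluid_level": self.fluid_level,
                "fluid_quality": self.fluid_quality}

class VacuumModule:
    def __init__(self, bin_fullness: float):
        self.bin_fullness = bin_fullness    # fraction of collection bin used

    def sensor_report(self) -> dict:
        return {"bin_fullness": self.bin_fullness}

def needs_service(module) -> bool:
    """Main-body check that works for any module's sensor report."""
    r = module.sensor_report()
    return (r.get("fluid_level", 1.0) < 0.1
            or r.get("fluid_quality", 1.0) < 0.2
            or r.get("bin_fullness", 0.0) > 0.9)
```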
- the robotic platform 100 may provide a service planning facility 120, such as including the ability to generate and store service plans 128 for one or more task areas.
- a service plan 128 may utilize a mapping facility 122, such as with capabilities to generate and utilize digital 2D maps 124 and 3D maps 126 for navigating through and providing planning services for service areas 140A-B.
- Digital maps may provide for 2D-based service area layouts and features, such as dimensions of the area, surface characteristics, objects permanently present in the area, objects transient through the space, entrance and exit locations for servicing the area, and the like, enabling navigation and planning through the service area.
- a digital map for a task area may be divided up into a plurality of work areas, where operations through the plurality of work areas are coordinated. Dividing up a large task area into a plurality of work areas may increase computational efficiency by limiting the size of the digital map being presented for processing at any one time.
- the digital map for a large task area may be stored in memory of the robotic platform 100, but where the task area digital map is further divided into work areas and loaded into working memory in chunks or as a progressive load. Maintaining computational efficiency through dividing a task area digital map into a plurality of work area digital maps may enable the robotic platform 100 to maintain a higher level of spatial resolution processing for navigating and servicing the task area.
- a task area may be divided into different sizes or levels/sub-levels of work area based on the size of the task area, service treatment for different work areas, tools required for different work areas, service module requirements, and the like.
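The chunked loading described above can be sketched as splitting a large task-area grid map into smaller work-area tiles so that only one tile need occupy working memory at a time. The occupancy-grid representation and tile dimensions below are assumptions for illustration.

```python
# A minimal sketch of dividing a large task-area grid map into work-area tiles.
# Each tile carries its (row, col) offset so results can be stitched back into
# task-area coordinates. Grid representation and tile size are assumptions.

def split_into_work_areas(grid, tile_rows, tile_cols):
    """Split a 2D occupancy grid (list of lists) into row-major tiles."""
    rows, cols = len(grid), len(grid[0])
    tiles = []
    for r0 in range(0, rows, tile_rows):
        for c0 in range(0, cols, tile_cols):
            tile = [row[c0:c0 + tile_cols] for row in grid[r0:r0 + tile_rows]]
            tiles.append(((r0, c0), tile))
    return tiles

# Example: a 4x6 map split into 2x3 work areas yields 4 tiles.
grid = [[0] * 6 for _ in range(4)]
work_areas = split_into_work_areas(grid, 2, 3)
```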
- Digital 3D maps may augment the utility of 2D maps by providing 3D-based service area features, such as low clearance areas, incline angles and locations, vertical transition locations between or within service areas (e.g., elevator, ramp), raised features in the service area (e.g., a bump, cable cover), and the like.
- the service planning facility 120 may be provided through a user interface of the robotic platform 100 or through an external computing facility (e.g., the user computing device 130).
- a service area mapping 142 of a service area may be generated by the robotic platform 100, such as through exploring the service area as part of a service area plan setup, or generated by an external mapping facility and downloaded to the robotic platform 100.
- a digital map may then be used in determining the movements of the robotic platform 100 throughout the service area, such as determined by a service plan 128.
- work areas described herein may exist on multiple levels (e.g., multi-floors of the same building), where each level has its own digital map and work area description.
- the robotic platform 100 may be able to move between levels using RF communication means to activate transition mechanisms (e.g., elevators, lifts) allowing movement between levels without human intervention.
- the robotic platform may be able to identify level transition points as a means of moving from work area to work area on different levels. Physically the work areas may be on different levels, but the robotic platform 100 may implement an overall service plan as one logical network of interconnected work areas through a limited number of access and level transition points.
- the robotic platform 100 may finish with a first work area on a first level and then utilize an elevator that is identified as a level transition point to proceed to a second work area on a second level.
- the robotic platform 100 may utilize an RF signal to call the elevator, enter the elevator on its own, send an RF signal to select the floor, and then exit the elevator when the proper floor has been reached.
- the robotic platform 100 may utilize elevator RF signaling protocols to communicate with an elevator system, such as where the elevator system includes an RF transceiver for communication with the robotic platform 100.
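The level-transition sequence described above (call the elevator over RF, enter, select the floor, exit at the target level) can be sketched against a simulated elevator transceiver. The `ElevatorRF` interface is hypothetical; the patent names no specific elevator protocol.

```python
# Hedged sketch of the RF-driven level transition: call, enter, select floor,
# exit. The elevator interface is simulated and purely illustrative.

class ElevatorRF:
    """Simulated elevator transceiver for illustration."""
    def __init__(self):
        self.current_floor = 1
        self.doors_open = False

    def call_to(self, floor):
        self.current_floor = floor
        self.doors_open = True

    def select_floor(self, floor):
        self.doors_open = False
        self.current_floor = floor
        self.doors_open = True

def transition_levels(elevator, robot_floor, target_floor):
    """Return the sequence of steps taken to move between levels."""
    steps = []
    elevator.call_to(robot_floor)        # RF signal to call the elevator
    steps.append("called elevator")
    steps.append("entered elevator")     # robot enters on its own
    elevator.select_floor(target_floor)  # RF signal to select the floor
    steps.append(f"selected floor {target_floor}")
    steps.append("exited elevator")      # exit when the floor is reached
    return steps
```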
- Navigation through service areas 140A-B may utilize a combination of digital map usage (e.g., localization determined based on navigation through the mapped area) and real-time sensors (e.g., sensors 104 monitoring the robotic platform's surrounding environment).
- a plurality of RF locator nodes 150A-C may be used to navigate and localize, as described herein, where the robotic platform 100 senses the location of the RF locator nodes 150A-C, such as in relation to their locations stored on the 2D digital map 124.
- the locations of the RF locator nodes 150A-C on the 2D digital map 124 may be determined in a number of ways: a user who has determined the locations of the RF locator nodes 150A-C may input them directly into the map; the robotic platform 100 may determine the locations during a service plan setup process (e.g., determining the locations as the robotic platform is navigated around the space, either automatically or manually by a user); and the like.
- the locations of the RF locator nodes 150A-C may be determined through a node location facility that determines the location of each of a plurality of RF locator nodes 150A-C in an area.
- the node location facility may utilize a reference point transmitter that is placed in the area to determine a first set of location coordinates for the plurality of RF locator nodes 150A-C.
- the reference point transmitter is then moved to a second location to determine a second set of location coordinates, and then a third location to determine a third set of location coordinates.
- These sets of location coordinates may then be used to geometrically determine the location of the plurality of RF locator nodes 150A-C.
- These locations may then be input to the 2D-digital map 124 to automatically and accurately locate the plurality of RF locator nodes 150A-C on the 2D digital map for use by the robotic platform 100 in localization within the area.
- This procedure may be repeated for different areas with different sets of RF locator nodes, and represents a system for easily determining the locations of RF locator nodes for new areas, or for existing areas where RF locator nodes have been added (e.g., to cover a dead-spot) or moved.
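The geometric determination from three reference-transmitter positions can be sketched as standard 2D trilateration: the function below linearizes the three range equations and solves the resulting 2x2 system. The coordinates and exact ranges are illustrative; a real UWB system would see noisy ranges.

```python
def trilaterate(refs, dists):
    """Estimate a node's 2D position from its measured distances to three
    known reference-transmitter positions (the geometric step above)."""
    (x1, y1), (x2, y2), (x3, y3) = refs
    d1, d2, d3 = dists
    # Subtract the first range equation from the other two to linearize:
    # 2(xi - x1) x + 2(yi - y1) y = d1^2 - di^2 + xi^2 - x1^2 + yi^2 - y1^2
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1          # zero if the reference points are collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
```

Repeating this for each node yields coordinates that can be written to the 2D digital map; a practical system would use additional reference positions and a least-squares solve to average out ranging noise.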
- RF locator nodes 150A-C may be located across an extended service area, such as across a large area, in multiple adjacent areas, throughout a facility, and the like, where the robotic platform 100 utilizes the best set of RF locator nodes 150A-C available to it. For instance, in a large area such as an arrival or departure area in an airport terminal, there may be a plurality of RF locator nodes located throughout the area, and as the robotic platform 100 moves through the area it may use the best set of RF locator nodes 150A-C available, such as the closest nodes, the nodes with the best signal, the set of nodes that provide the best spatial positioning combination, and the like.
- the robotic platform 100 may switch RF locator nodes in a continuous process of optimization. For instance, when a robotic platform 100 enters a service area it may sense and utilize nodes 1, 2, and 3, but as it progresses through the area it may get farther from node 1 but closer to a new node 4, and at some point (e.g., determined by signal strength, by geometric layout, and the like) switch from using nodes 1, 2, and 3 to using nodes 2, 3, and 4, where switching is a handover from one set of nodes to another set of nodes, where the first and second set of nodes may or may not comprise an overlapping set of nodes.
- the robotic platform 100 may constantly assess and optimize the set of RF network nodes 150A- C it uses as it moves through a service area or through multiple service areas.
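A minimal sketch of this continuous node-set optimization, assuming received signal strength is the selection criterion. The hysteresis margin prevents rapid back-and-forth handovers near the switching point; its value, and the use of RSSI alone, are illustrative assumptions.

```python
def select_nodes(signal_strengths, current=None, k=3, hysteresis=2.0):
    """Pick the k strongest RF locator nodes, with hysteresis so the
    platform only hands over when a new node is clearly better.
    signal_strengths: dict of node_id -> RSSI (higher is better)."""
    ranked = sorted(signal_strengths, key=signal_strengths.get, reverse=True)
    best = set(ranked[:k])
    if current is None:
        return best
    # Keep the current set unless some new node beats the weakest current
    # node by more than the hysteresis margin.
    weakest_current = min(signal_strengths.get(n, float("-inf")) for n in current)
    strongest_new = max((signal_strengths[n] for n in best - set(current)),
                        default=float("-inf"))
    return best if strongest_new > weakest_current + hysteresis else set(current)
```

This reproduces the handover narrative above: as node 1 fades and node 4 strengthens, the working set eventually switches from {1, 2, 3} to {2, 3, 4}.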
- RF network nodes 150A-C may augment the use of sensors 104 utilized for localization and navigation through work areas of a service plan.
- RF network nodes 150A-C may not be required, such as where the robotic platform 100 utilizes only onboard sensors 104 for localization and navigation.
- some service plans for facilities may not require RF network nodes 150A-C at all, and as such, no RF network nodes 150A-C may be installed in a service area 140A-B.
- RF network nodes 150A-C are most useful in operational situations where ambiguity may exist, such as in a very large or long room. As such, RF network nodes 150A-C may be selectively employed as required to eliminate such ambiguity. In some cases they may be the only technology that can resolve ambiguity. However, there are a plurality of navigation or localization systems available to the robotic platform, such as the RF network node system 150A-C, stereoscopic imaging sensors (e.g., 2D LIDAR, stereo camera), 3D LIDAR, dead reckoning, and the like.
- the robotic platform 100 may optimally combine information from any one or combination of the sensor 104 and RF network node 150A-C facilities. In embodiments, optimization may be implemented by dynamically weighting the availability and performance of each of the available sensor 104 and/or RF network node 150A-C inputs.
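One way to realize the dynamic weighting described above is inverse-variance fusion, where each available input contributes in proportion to its estimated certainty and unavailable inputs simply drop out. The variance figures a real system would assign to each sensor or RF-node input are assumptions here.

```python
def fuse_position(estimates):
    """Inverse-variance weighted fusion of 2D position estimates from the
    available sensor / RF-node inputs; unavailable inputs are passed as
    None and are skipped (the availability weighting described above).
    estimates: list of ((x, y), variance) tuples, or None entries."""
    num_x = num_y = den = 0.0
    for est in estimates:
        if est is None:
            continue                 # input unavailable this cycle
        (x, y), var = est
        w = 1.0 / var                # more certain inputs weigh more
        num_x += w * x
        num_y += w * y
        den += w
    return (num_x / den, num_y / den)
```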
- the robotic platform 100 may navigate through the service area through a combination of sensor-based position estimation and positional prediction based on the physical movements of the robotic platform 100. For instance, positional encoders from the propulsion mechanisms may provide odometry data for positional measurements, and based on kinematic algorithms and the plan for movement (e.g., per the service plan and 2D digital map), may be used to predict where the robotic platform 100 will be in time. This prediction may then be compared with a combination of sensor-based position measurements, such as from the LIDAR, camera, IMU, and the like. This comparison may then be probabilistically analyzed to determine a best estimate for the current position of the robotic platform 100.
- the robotic platform 100 may utilize this process in navigating through sensor-augmented dead reckoning or other localization mechanisms such as stereoscopic imaging sensors, 3D LIDAR, and the like.
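The probabilistic comparison of prediction and measurement can be sketched as a scalar Kalman-style update, shown here in one dimension for clarity; the variance values a real system would maintain are assumptions.

```python
def kalman_update(predicted, pred_var, measured, meas_var):
    """Blend an odometry-based position prediction with a sensor-based
    position measurement, each weighted by its certainty, to produce a
    best estimate of the current position (scalar Kalman update)."""
    gain = pred_var / (pred_var + meas_var)   # high gain -> trust the measurement
    best = predicted + gain * (measured - predicted)
    var = (1.0 - gain) * pred_var             # fused estimate is more certain
    return best, var
```

With equally trusted inputs the estimate lands midway between prediction and measurement, and the fused variance halves, which is the qualitative behavior the comparison step above relies on.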
- Service plans 128 store the details for providing service to a service area 140A-B, such as mapping for navigation around the service area, features of the service area and how to navigate around them, a schedule for servicing, service modules to be used, consumable resources, order of service modules, and the like.
- Service plans 128 may include service details for one or more different service areas.
- the service plan may include a plurality of service areas, such as service area 140A and a service area 140B.
- Service area 140A and service area 140B may have different layouts and surface types, such as service area 140A being an interior area with a linoleum floor and service area 140B being an indoor-outdoor area with a rug.
- Each area may have different fixed obstacles 146 (e.g., furniture) or may have service plan notes for activity within the area, such as the area being a high pedestrian traffic area or where furniture is regularly moved around.
- a task area may include different requirements for tracking and treating around the periphery of a work area.
- a work area may have a hard boundary (e.g., a wall) or a soft boundary such as the boundary between two different treatment surface areas (e.g., between a rug and a tile floor).
- Treatment at a boundary may require different operational constraints in order to not apply service over the boundary (e.g., getting a wall or adjacent work area wet from a cleaning treatment), such as slowing down the movement of the robotic platform 100, slowing down the motion of a tool of a service module 102A-D, applying less cleaning material to the surface, applying a cleaning material with less pressure, and the like, as the robotic platform 100 navigates near the task area boundary.
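As a non-limiting sketch, slowing near a boundary can be implemented as a simple ramp on distance-to-boundary; the zone width and the speed limits below are illustrative assumptions, and the same scaling could drive tool speed or fluid pressure instead of platform speed.

```python
def boundary_speed(dist_to_boundary, full_speed=1.0, slow_speed=0.2, slow_zone=0.5):
    """Scale platform (or tool) speed down as the platform approaches a
    task-area boundary. Units: meters for distance, m/s for speed."""
    if dist_to_boundary >= slow_zone:
        return full_speed
    # Linearly ramp from full speed at the edge of the slow zone
    # down to slow_speed right at the boundary.
    frac = max(dist_to_boundary, 0.0) / slow_zone
    return slow_speed + frac * (full_speed - slow_speed)
```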
- the service plans 128 may be initially generated during a service set-up, but may also be updated.
- a user may provide updates to the service plan 128.
- the robotic platform 100 may be enabled to update the service plan 128 based on new information gathered during regular service execution. For instance, the robotic platform 100 may use machine-learning to determine changes to service areas 140A-B, and incorporate those changes into a revised service plan (e.g., furniture has been moved from a location initially identified during setup of the service plan, to a new area).
- the robotic platform 100 may adapt a service plan sequence based on conditions encountered during service treatment. For instance, the robotic platform 100 may, during the cleaning of service area 140B, encounter a special treatment area 144 (e.g., a detected stain on the floor, a user-marked or spot-treated area on a floor, a high level of soiling requiring special treatment). The robotic platform 100 may sense the special treatment area (e.g., through surface sensing, imaging) or a user may input the location of the special treatment area 144 for special attention.
- the robotic platform 100 may provide additional service to the special treatment area 144 (e.g., apply more pressure to the cleaner, slow down the speed of the robotic platform to increase the amount of cleaner provided, go over the area a second time, and the like) at the time the robotic platform 100 first encounters it or at a later time (e.g., scheduling a return to the area).
- the robotic platform 100 may encounter an obstacle 146 and adaptively go around it, or encounter a special treatment area 144 and adaptively skip the area for later treatment, such as at the end of the current task or with another service module type at a later time.
- the robotic platform 100 may store the location of the obstacle 146 or the special treatment area 144 for later treatment.
- the robotic platform 100 may perform an analysis with respect to postponed treatment areas.
- the robotic platform 100 may analyze the time required to return to an area against the benefit of returning to that location given the time constraints for the day. For instance, there may be a number of areas that have been skipped, and there is not enough time to return to all of them, so the robotic platform 100 may perform an analysis to maximize the benefit in the time allocated for completion of the area service.
- the robotic platform 100 may encounter two areas that need special treatment 144 (e.g., one small and one large soiled area) and one obstacle 146 (e.g., a chair in the middle of the floor) that forced the robotic platform 100 to skip the area around the obstacle.
- the robotic platform 100 may have determined that the special treatment area identified as a small re-treatment area should be re-treated immediately (e.g., because the time required for re-treatment was below a maximum time threshold for immediate re-treatment), and as such, applied treatment a second time, such as in a repeat pass of the area.
- the second special treatment area was determined to take an amount of time that exceeded a maximum threshold of time for immediate re-treatment, and whose location was instead stored for possible later re-treatment.
- the robotic platform determined that the obstacle was a static obstacle and stored its location for possible later re-treatment.
- the robotic platform 100 may conclude from the analysis that it should return to the special treatment area because it will take an amount of time that is within a time constraint availability (e.g., there's enough time left in the work day to complete the re-treatment, there's enough charge left in the batteries to complete the re-treatment, and the like).
- where the analysis determines that the static obstacle represents a small re-treatment area, it also determines that, probabilistically, the static obstacle will still be in place upon return (e.g., the chair will still be there), and in weighing the time available against the probability that the obstacle is still there, determines not to return for a re-treatment at this time.
- the robotic platform 100 notes the area for extra treatment on a subsequent servicing of that area on another day. The robotic platform is thus adaptable to changing conditions and is able to store and analyze the need for changes in its service plan routine based on environmental changes encountered.
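The weighing of time available, re-treatment benefit, and obstacle probability described above can be sketched as a greedy selection over the stored skipped areas. The benefit scores, time estimates, and accessibility probabilities below are illustrative assumptions; a real planner would also fold in battery charge and travel time.

```python
def plan_retreatments(candidates, time_budget):
    """Decide which skipped areas to revisit before time runs out.
    Each candidate is (name, benefit, time_needed, p_accessible), where
    p_accessible discounts areas an obstacle may still block.
    Greedy by expected benefit per unit time."""
    scored = sorted(candidates,
                    key=lambda c: (c[1] * c[3]) / c[2], reverse=True)
    chosen, remaining = [], time_budget
    for name, benefit, t, p in scored:
        if t <= remaining:        # only revisit what still fits in the day
            chosen.append(name)
            remaining -= t
    return chosen
```

With a large soiled area and a chair-blocked patch as candidates, a tight budget selects only the high-value re-treatment, matching the example above.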
- the robotic platform 100 may utilize a service robot resource facility 110, such as at a 'home' location that provides resources, including service module material supply (e.g., consumable materials, cleaning heads), service module exchange 114, electric charging station 116, and the like.
- one or more of these functions may be automatic, where the robotic platform 100 executes a function without user support; assisted, where the robotic platform 100 performs some aspect of the function but with user support; or manual, where the robotic platform 100 returns to the service robot resource facility 110 for user execution of the function.
- the robotic platform 100 may wirelessly communicate with a user computing device 130 to notify the user (e.g., sending alarms and alerts to the user whenever needed to prompt the user to action, inform the user of a completed action, to change the replaceable service module, and the like).
- the user computing device 130 may provide for a user interface for communicating with, and monitoring the progress and performance of, the robotic platform 100.
- the robotic platform 100 may return to the service robot resource facility 110 and autonomously perform the function. For example, the robotic platform may return to the electric charging station 116 when its batteries are low or at the end of a service day.
- the electric charging station 116 may comprise a contactless charging facility that enables the robotic platform 100 to automatically charge its batteries while in proximity to the charging facility.
- the robotic platform 100 may automatically connect to a consumable resource port of a service module material supply 112, such as a station for filling or emptying fluids.
- the robotic platform 100 may service multiple service areas utilizing multiple service modules, such as in a coordinated process outlined in a stored service plan.
- the robotic platform 100 may have a continuous need to switch between service modules 102A-D.
- the service module exchange facility 114 may be located at a designated location where service modules 102B-D are stored while the robotic platform 100 provides a service with service module 102A.
- the robotic platform 100 may do so through aid of a user or automatically through the service module exchange facility 114 (e.g., a mechanism for automatically switching between service modules).
- the robotic platform 100 may wirelessly communicate with a user through a user computing device 130.
- the robotic platform 100 may have completed vacuuming a rug-covered service area 140B with service module 102A, and per a service plan sequence, is next directed to wash a linoleum-covered service area 140A with service module 102B.
- the robotic platform 100 may then automatically proceed to a pre-designated location for exchanging the service modules 102A-B, such as at the service robot resource facility 110, and simultaneously communicate a need for a level of user support to make the exchange through a message sent to the user computing device 130.
- the user may then assist the robotic platform in the exchange to whatever extent is required, such as the user manually exchanging the service modules, the user aiding a semi-automatic process for exchanging the service modules, the user monitoring an automatic exchange of the service modules, and the like.
- robotic platform 100 is adapted to travel over generally planar surfaces via the operation of a propulsion mechanism 202.
- Propulsion mechanism 202 may include a drivable wheel assembly or other mechanism capable of providing controlled motion of robotic platform 100.
- Robotic platform 100 further includes a top mounted imaging system 204 adapted to image the surrounding environment.
- Imaging system 204 may be comprised of a 3D LIDAR system adapted to produce three- dimensional point cloud information indicative of a surrounding environment.
- Such point clouds may be comprised of a 360-degree rotational sweep about the robotic platform 100 whereat, for each degree of sweep, there is incorporated data comprising imaging in the z-direction.
- the imaging system 204 may be comprised of a stereoscopic vision system adapted to produce a three-dimensional model of the surrounding environment. While described herein as incorporating, generally, a 3D LIDAR system, in practice either the 3D LIDAR system or the stereoscopic imaging system may be used alone or in combination.
- Robotic platform 100 may further comprise one or more ultrasonic sensors 206.
- Ultrasonic sensors 206 operate, generally, to detect near field objects in the direction of movement of the robotic platform 100 as described more fully herein.
- Robotic platform 100 may further comprise one or more 2D LIDAR systems 208. Each 2D LIDAR system operates, generally, to image a two-dimensional wedge formed by scanning a laser in front of the robotic platform in the direction of movement, such as forward motion relative to the orientation of the robotic platform, reverse, or to a side during turns.
- robotic platform 100 may include a camera 210 for providing images using visible light, near-IR, and IR wavelengths. Camera 210 may be adapted to image, generally, but not exclusively, in front of the robotic platform 100 in the direction of movement.
- Robotic platform view 100A illustrates the service module 102 (any of 102A- D) mounted inside a main body 214 of the robotic platform 100.
- service module 102A-D is an interchangeable and self-contained element that may be removed from robotic platform 100 and replaced by one or more other service modules 102A-D.
- Fig. 3 depicts a robotic platform view 100B with the service module 102 removed.
- Each service module 102A-D is but one of a plurality of types of service modules each directed to a distinct and defined operating mode, depending on the desired service.
- a service module 102 may be comprised of its own processor for managing the operation of the service module 102 and the various components forming a part of the service module 102.
- a service module 102 may incorporate its own power storage and/or power generation system for providing power to itself and/or to the robotic platform 100 of which it forms a part.
- each service module 102A-D may incorporate one or more sensors including, but not limited to, pressure sensors, moisture sensors, LIDAR systems, imaging systems, and the like, tailored to performing in an operating mode for which the service module 102A-D is designed.
- the main body 214 of the robotic platform 100 may be equipped with similar sensing and processing capabilities to those described in connection with the service module 102A-D.
- a service module 102A adapted to provide cleaning services to a surface comprised of thick rug may incorporate a 2D LIDAR system for evaluating, either statically or dynamically, a two-dimensional profile of the rug to determine surface roughness or pile length.
- the same service module 102A may comprise an imaging system for sensing anomalous stains in the rug so as to apply additional cleaning solution.
- a service module 102B may include a series of sensors and mechanisms designed to buff and polish marble floors.
- the service module 102A-D provides functions appropriate to a particular task according to a desired mode of operation as inserted into robotic platform 100. As described more fully herein, the interchangeability of the service modules 102A-D may be achieved via human intervention or without direct human intervention in accordance with a defined automated operation regimen.
- an imaging system of the robotic platform 100 may incorporate a 3D LIDAR system and/or stereoscopic imaging system (e.g., 2D LIDAR or stereo cameras).
- the 3D LIDAR and/or stereoscopic imaging system may be used, for example, to establish a static or semi-static mapping or model of an environment.
- the 3D LIDAR system and/or stereoscopic imaging system may be used to dynamically update an existing map, to perform localization, to perform pose estimation, to perform object and/or obstacle detection, to perform and verify obstacle mitigation, and the like.
- the 3D LIDAR and/or stereoscopic imaging system may be used to create an initial mapping, in 2D or 3D, of an environment in which the robotic platform 100 is intended to operate.
- the robotic platform 100 may be guided along a path or paths within a defined service area 140A- B in which it is to operate so as to create a 3D point cloud representing the three-dimensional structure of the environment.
- the 3D point cloud so produced may be algorithmically processed with or without human intervention to produce a three-dimensional model of the environment sufficient to permit the operation of the robotic platform 100 in accordance with a predefined or dynamically determined mode of operation.
- the three-dimensional model may be stored within a memory of the robotic platform 100 or made available from an auxiliary memory such as, for example, via wireless communication to an external data server.
- the initial traversing of the environment to create the three-dimensional model may be performed autonomously by the robotic platform 100.
- the initial traversing may incorporate human guidance.
- a user may use a wireless joystick to guide the robotic platform 100 around a prospective service area 140A-B while creating a three-dimensional model.
- the creation of such a model may be offered as a service to customers.
- the three-dimensional model may be constructed by an apparatus not forming a part of the robotic platform 100 but otherwise communicated to the robotic platform 100, such as downloaded to the robotic platform 100 from an external computing facility.
- the model may be used to perform localization.
- localization refers to the process of determining a spatial location within a predefined environment.
- localization may make use of a plurality of sensors 104 and inputs at any time.
- 3D LIDAR and/or stereoscopic imaging system may be used in conjunction with ultra- wide band signaling via RF locator nodes 150A-C, when required, to determine a position of the robotic platform 100.
- the robotic platform 100 continually scans its environment to produce a localized 3D model of its environment.
- the robotic platform 100 may then compare this localized model to the predetermined three-dimensional model of the environment, such as via a statistical, best-fit methodology including, but not limited to, principal component analysis, regression, Kalman filtering, and the like, in order to determine a position of the robotic platform 100.
- comparing need not be performed blind, but, rather, may incorporate predictive technology.
- the robotic platform 100 may utilize sensors including, but not limited to, inertial measurement units (IMUs), odometers, and the like, to predict changes in location between LIDAR scans.
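Predicting the change in location between LIDAR scans from odometer and IMU data can be sketched as a simple dead-reckoning step. The unicycle-style motion model below is an assumption; an actual platform would use its full kinematic model.

```python
import math

def predict_pose(x, y, heading, distance, dtheta):
    """Dead-reckoning prediction between scans: apply the IMU-reported
    heading change, then advance by the odometer-reported distance
    along the new heading. Returns the predicted (x, y, heading)."""
    heading += dtheta
    return (x + distance * math.cos(heading),
            y + distance * math.sin(heading),
            heading)
```

The predicted pose seeds the best-fit comparison against the stored model, so the scan matcher searches near the prediction rather than blind.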
- Referring to Fig. 3, there is illustrated a view of the robotic platform 100B showing the vacant cavity where a service module 102A-D has been removed, according to an exemplary and non-limiting embodiment.
- On either side of the cavity there may be an affixed rail.
- the rails may be adapted and positioned such that rollers attached to a service module 102A-D may utilize the rails when sliding into position. Because of the substantial weight of the main body 214, a considerable force pushes down and out on the sides of the unit that tends to place stress on the main body.
- the cavity may include a locking mechanism, a power connection, a signal connection, and the like, to enable electrical connection between the main body 214 and the service module 102A-D.
- Service module 102A-D may include a locking mechanism, a signal connection, a power connection, a tool controller, sliding mechanism, and the like. Each service module 102A-D may comprise a unique tool functional mechanism that may contain motors, actuators, brushes, fluid storage, and the like, as appropriate to the service module function. As described herein, the service module 102A-D may likewise include one or more sensors 106 associated with service module functioning such as current sensors, pressure transducers, location sensors, and the like.
- each service module 102A-D has a computer control system to modify its functions and its own software package.
- the service module 102A-D may be programmed for the function.
- processes such as those related to navigation, localization, task management/initialization, and the like, are performed using sensors and at least one processor 106A in the main body 214 while the service module 102A-D is programmed for application-specific functions.
- a service module 102A-D may comprise its own power source, such as a battery, so as not to burden the electrical load on the main body 214 of the robotic platform 100.
- service modules 102A-D when not in operation or attached to a main body 214, may be actively charging to provide near continuous power availability upon demand.
- charging may be performed in a contactless manner such as by, for example, inductive charging.
- With inductive charging, the robotic platform 100 would monitor its state of charge and, when appropriate or as part of a pre-defined procedure, plan a route to a charging pad located on the floor at some pre-designated location. Upon stopping over or near the charging pad, the automated charging process could begin.
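The monitor-route-charge procedure can be sketched as a small state machine stepped on each control tick. The state names and charge thresholds are illustrative assumptions.

```python
def charging_step(state, soc, at_pad, low=0.2, full=0.95):
    """One tick of the inductive-charging routine: monitor state of
    charge (soc, 0..1), route to the floor pad when low, charge over
    the pad until full, then resume service."""
    if state == "SERVICING":
        return "ROUTE_TO_PAD" if soc <= low else "SERVICING"
    if state == "ROUTE_TO_PAD":
        return "CHARGING" if at_pad else "ROUTE_TO_PAD"
    if state == "CHARGING":
        return "SERVICING" if soc >= full else "CHARGING"
    return state
```

An end-of-day or scheduled charge would simply force the transition to `ROUTE_TO_PAD` regardless of the current state of charge.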
- the main body 214 comprises user interface 402.
- the user interface 402 may include all elements for a user to conduct task planning and to operate the equipment including, for example, visual interface screen, element selection mechanism, on/off control, emergency stop and pause buttons, etc.
- the body 214 may further comprise a power and battery management system 408 which may include battery cells and any required battery management systems.
- the body 214 may further comprise a charging system 406 to connect or interface with facility power to directly or inductively charge the batteries.
- the body 214 may further comprise a power distribution system 408 to provide electrical isolation and protection to the unit. Power distribution system 408 may convert and distribute battery power to voltages required by various system elements.
- the body 214 may further comprise a processor 106A to execute software to plan, manage, navigate and execute functions of the body 214.
- the body 214 may further comprise a locomotion system 412 to control the drive mechanism to move the body 214 and a feedback control mechanism to implement guidance from the processing module 106A.
- the main body 214 may further comprise a selection of various technology sensors 104A to sense the internal and external environment of the main body 214. It may include numerous sensor technologies such as inertial, vision, laser radar, ultrasonic, electromagnetic, or other types of sensors valuable for determining unit location, pose and condition.
- the main body 214 may further comprise an external communications module 416 to interface with external systems via radio frequency methods, direct digital methods, or audible methods.
- Service module 102A-D may comprise a processor 106B to interface with the main unit processing module 106A to receive, plan, control and execute service module related tasks.
- Service module 102A-D may further comprise module sensors 104B to provide information related to service module tasks, including such elements as pressures, distances, visual conditions, solution levels, etc.
- Service module 102A-D may further comprise one or more end effector control elements 422 to initiate, monitor, and control end effector 424 elements such as motors, actuators, and other functions associated with the module functioning for accomplishing the module's task, which may include a variety of functions such as with brushes, cleaning or polishing heads, vacuum heads, manipulator arms, etc.
- Service module 102A-D may further include a power management module 426 to receive power from the main body 214 or an optional auxiliary power supply 428 to appropriately condition and/or distribute it.
- an IMU 502 and rotary encoder 504 may provide inputs to an odometry and IMU extended Kalman filter 510, which, along with other sensors such as an ultra-wide band module 506 (e.g., utilizing signals communicated with RF locator nodes 150A-C) and a 2D laser scanner 508, further provides input to a robot pose localization extended Kalman filter (global) 512.
- the odometry and IMU extended Kalman filter 510 process may provide input, along with additional sensors such as a camera 210, to an odometry filtered and visual odometry from camera extended Kalman filter 516, which is then also provided as an input to the robot pose localization extended Kalman filter (global) 512.
- This layered, sensor-based localization process may then contribute to the generation of a path planner node 514, such as in the generation of a plurality of path planner nodes providing robot localization as the robotic platform 100 moves through the service area in the implementation of a service plan.
- the 3D LIDAR and/or stereoscopic imaging sensors may be used to perform pose estimation.
- pose estimation refers to localization with the added attribute of orientation, such as a compass heading indicative of a direction the robotic platform 100 is facing, or tilt.
- “localization” and “pose estimation” are used interchangeably.
- individual sensors 606A-D of various types may comprise inputs to associated local smoothing filters 604A-D (analog or digital) to control signal and noise near the signal source.
- the outputs of the smoothing filters may form inputs to associated unique sensor-type weighting and usage rules 602A-D forming a part of a pose estimation filter.
- the pose estimation filter may output an integrated pose estimate.
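The smoothing-then-weighting pipeline can be sketched as follows, using exponential smoothing for the local filters and a weighted average for the integration step. The sensor names, weights, and smoothing constant are assumptions; the pose is shown as a scalar for clarity.

```python
def integrated_pose(readings, weights, smoothed, alpha=0.3):
    """One pass of the pose estimation filter: each sensor's reading is
    run through a local exponential smoothing filter (state kept in the
    `smoothed` dict), then combined via per-sensor-type weights into a
    single integrated pose estimate."""
    total_w = 0.0
    estimate = 0.0
    for sensor, value in readings.items():
        # Local smoothing near the signal source to control noise.
        prev = smoothed.get(sensor, value)
        smoothed[sensor] = prev + alpha * (value - prev)
        w = weights[sensor]                 # sensor-type weighting rule
        estimate += w * smoothed[sensor]
        total_w += w
    return estimate / total_w
```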
- the ultrasonic sensors 108 may be used to continually perform fine adjustments of the robotic platform's 100 location and pose to enable near-continuous contact of tools forming a part of a service module 102A-D with a floor and/or a wall intersection.
- the 3D LIDAR and/or stereoscopic imaging sensors may be used to perform object detection and obstacle mitigation.
- 3D LIDAR or stereoscopic imaging sensors may be used, alone or in conjunction with, for example, ultrasonic sensors 106, to determine if an object is obstructing the motion of the robotic platform 100 along an intended path of travel.
- the robotic platform may stop to avoid the object and may further engage in one or more of a plurality of mitigation techniques in a hierarchical manner. For example, if the object can be identified, such as by visual inspection, LIDAR scanning or the like, it may be determined that the object is static and may be moved. In such instances, the robotic platform may employ an articulated appendage to move the object.
- the robotic platform may decide to transmit a communication, such as one containing a picture of an obstruction, to a supervisory person (e.g., through a user computing device 130) who may respond with instructions or may arrive on scene to remove the obstruction.
- the obstacle may be dynamic, such as a dog or person, and may respond to requests to move out of the way.
- the robotic platform 100 may autonomously act to alter its intended path. In other instances, the robotic platform may cease operation if conditions dictate.
- the robotic platform 100 may store instances of disrupted operations for further processing, such as to come back to that location later to cover an area that was skipped.
- strategies to deal with obstructions may depend upon a hierarchical decision-making process that receives input data from the robotic platform's 100 various sensors.
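The hierarchical mitigation described above — request a dynamic obstacle to move, move a known movable object with an appendage, replan around an immovable one, or escalate to a supervisor — can be sketched as a simple priority cascade. The dictionary keys and action names below are hypothetical placeholders for the platform's fused sensor conclusions.

```python
def mitigate_obstacle(obstacle):
    """Walk a hierarchy of mitigation strategies for a detected obstacle.

    `obstacle` carries hypothetical classification flags: 'dynamic'
    (e.g., a person or dog), 'identified', and 'movable'. Strategies are
    tried in priority order, ending with supervisor escalation.
    """
    if obstacle.get("dynamic"):
        return "issue_audible_request"   # ask a person or animal to move
    if obstacle.get("identified") and obstacle.get("movable"):
        return "move_with_appendage"     # static, known, and movable
    if obstacle.get("identified"):
        return "replan_path"             # known but immovable: alter the path
    return "notify_supervisor"           # unknown: transmit a picture and wait
```

A real implementation would also log the disrupted location for later return, as described above.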
- the robotic platform 100 may have the ability to update a predetermined three-dimensional map based upon the repeated appearance of one or more obstacles 146.
- a robotic platform 100 used to polish the floors of an airport may proceed in accordance with a model that defines a first configuration of seats at a boarding gate.
- the seats may be interpreted by the robotic platform 100 as comprising obstacles.
- the robotic platform 100 may operate to ensure the reclassification of such obstacles as being enduring physical constructs forming, for example, a part of an updated three-dimensional model of the environment.
- the process of object detection and mitigation may vary depending, at least in part, upon the unique environment and conditions within which a robotic platform 100 is operating. For example, if a robotic platform 100 is mowing a lawn and sees a child, it may stop, issue a warning, and proceed around the child.
- the process of object detection and mitigation may also depend upon factors such as the speed of the robotic platform 100 (e.g., dictated by a plan) and the service module 102A-D in use (e.g., dictated by the application).
- object detection and avoidance may be handled in a variety of ways described herein.
- sensors such as a side panel capacitive sensor 702, wall following sensors 704, ultrasound sensors 706, 2D LIDAR 708, rear segmented LIDAR 710, camera 210, and the like, may provide input sensor sources for obstacle avoidance 720 that may then be provided to a global planner 722 for a plurality of planning algorithms, such as for a point A to point B algorithm 730, full coverage algorithm 732, spot cleaning algorithm 734, wall following algorithm 736, and the like.
- Algorithmic outputs may then be used for sensing, obstacle avoidance, and path planning for the robotic platform 100, such as when the robotic platform encounters an obstacle 146 that is either anticipated or unexpected with respect to the service plan being executed.
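The sensor-to-planner pipeline above — sensor sources feeding obstacle avoidance 720, which feeds a global planner 722 that selects among planning algorithms — can be sketched as a dispatch table. The function bodies and dictionary keys are hypothetical stand-ins for the point-A-to-B, full-coverage, spot-cleaning, and wall-following planners named above.

```python
def point_a_to_b(task):   return f"route {task['start']} -> {task['goal']}"
def full_coverage(task):  return f"boustrophedon sweep of {task['area']}"
def spot_cleaning(task):  return f"spiral pass at {task['spot']}"
def wall_following(task): return f"trace edges of {task['area']}"

# Dispatch table mirroring the planning algorithms 730-736 (keys hypothetical).
PLANNERS = {
    "a_to_b": point_a_to_b,
    "full_coverage": full_coverage,
    "spot": spot_cleaning,
    "wall": wall_following,
}

def global_planner(task, obstacle_inputs):
    """Select and run the planning algorithm for a task.

    `obstacle_inputs` stands in for fused detections from the capacitive,
    ultrasound, LIDAR, and camera sources; any pending detection gates
    planning until avoidance has been handled.
    """
    if obstacle_inputs:
        return "avoid_first"
    return PLANNERS[task["kind"]](task)
```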
- a suite of tools may be provided to create, maintain, and attribute the three-dimensional model of the environment.
- the sensors on the robotic platform 100 may be employed to generate such a model.
- a point cloud may be obtained from a 3D LIDAR sensor.
- a stereoscopic imaging system may be employed to create a 3D model.
- one or more software modules may be employed to, for example, manipulate the raw point cloud and/or further attribute the resulting model.
- various areas forming a spatial part of the model may be attributed with multidimensional data.
- attributes may apply to any of a variety of defined points, areas, and volumes.
- a human operator or user may attribute areas of the three-dimensional model depending on surface type.
- This point cloud may be converted on the main unit or on an external processing system to two-dimensional representations of the area.
- a spatial representation, such as a floor plan derived from the three-dimensional representation or separately derived, can be used as the basis for describing surface types. In the representation, the user may be presented with a floor-map-like display.
- the user is then able to identify work areas by defining area vertices. These defined work areas are then attributed with various information such as height of fixed obstacles or surface types. Examples of surface types include, but are not limited to, marble, carpet, wood, grass, etc. Other attributes may be temporal in nature. For example, areas that experience a high volume of foot traffic may be designated to be cleaned nightly whereas other less traveled areas may be designated to be cleaned weekly. Further, areas and points may be attributed to specify modes of operation that are to be performed upon them and the order in which different modes are to be performed. For example, an area comprised of a wood floor may be designated to be swept with a service module 102A adapted to sweep prior to being polished by a service module 102B adapted to perform polishing. If developed on a separate processor, this attributed model may be downloaded to the robotic platform 100 for operation.
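A minimal sketch of one attributed work area, assuming a simple record per user-defined polygon; the field names are hypothetical, but they mirror the attributes described above: polygon vertices, surface type, a temporal cleaning interval, and an ordered list of modes of operation.

```python
from dataclasses import dataclass, field

@dataclass
class WorkArea:
    """One user-defined work area in the attributed model (fields hypothetical)."""
    name: str
    vertices: list                       # (x, y) polygon corners drawn on the floor map
    surface: str                         # e.g., "marble", "carpet", "wood", "grass"
    cleaning_interval_days: int = 7      # temporal attribute: nightly = 1, weekly = 7
    modes_in_order: list = field(default_factory=list)  # e.g., ["sweep", "polish"]

wood_floor = WorkArea(
    name="lobby",
    vertices=[(0, 0), (10, 0), (10, 8), (0, 8)],
    surface="wood",
    cleaning_interval_days=1,             # high foot traffic: clean nightly
    modes_in_order=["sweep", "polish"],   # sweep prior to polishing
)
```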
- the attributed model may be used as input to an algorithmic methodology for determining an efficient plan for traversing all required tasks.
- This methodology may consist of two parts, such as ordering the tasks to be performed and then describing the specific path plan to be taken to the different work areas and within the work areas.
- the robotic platform 100 may commence to follow the plan in an autonomous or semi- autonomous manner to complete the tasks specified in the plan.
- a robotic platform 100 may receive a plan calling for the vacuuming of a first area using a first service module 102A and the buffing of a second area using a second service module 102B.
- the plan may not specify the precise route for the robotic platform 100 to follow to perform either task. Rather, the robotic platform 100 may utilize an on-board processor to determine an optimized path to follow to complete each task. In such instances, the path to be followed to exchange interchangeable service modules 112 may be determined on-board as well.
- an "optimal" or “optimized” path is a path determined to be preferable based upon one or more performance metrics.
- the robotic platform 100 may operate autonomously to detect a surface characteristic and (in some embodiments) select a service corresponding to the detected characteristic.
- a variety of sensors may be used to detect a surface characteristic. For example, an extendable spinning wheel may be used to measure the mechanical resistance of a surface.
- reflectance patterns including color and directional intensity, may be sensed and used to identify a surface.
- an IR detector may detect attributes of vegetation and soil levels including, but not limited to, vegetation height and moisture level. These detected attributes might be used to determine, for example, a blade height or a chemical to be applied. Based on the surface type identified, the robotic platform 100 may alter its mode of operation, including its choice of service modules 112.
- a single robotic platform 100 may, in response to sensing a surface characteristic, select a mode of operation and deploy to service the sensed area in accordance with the selected service and utilizing the appropriate service module 102A-D.
- the robotic platform 100 may make use of various static or dynamically updateable rules when performing tasks.
- a rule may specify that when sweeping and buffing of a single surface is indicated, the sweeping mode of operation should always be performed prior to the buffing mode of operation.
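A precedence rule of this kind can be sketched as a small reordering pass over the requested modes. The rule set and mode names below are hypothetical; dynamically updated rules would simply mutate the rule set.

```python
# Static precedence rules: each pair (before, after) means `before` must
# be performed prior to `after` on a single surface.
PRECEDENCE = {("sweep", "buff")}

def order_modes(modes):
    """Reorder modes so every (before, after) precedence rule is honoured.

    A repeated swap pass suffices for the handful of rules a platform
    might carry; each iteration fixes at least one violated rule.
    """
    modes = list(modes)
    changed = True
    while changed:
        changed = False
        for before, after in PRECEDENCE:
            if before in modes and after in modes:
                i, j = modes.index(before), modes.index(after)
                if i > j:
                    modes[i], modes[j] = modes[j], modes[i]
                    changed = True
    return modes
```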
- the robotic platform 100 may react to priority attributes or may impose priority based decision making in an autonomous fashion when completing tasks. For example, a plan may specifically indicate which areas are to receive priority. In other instances, the robotic platform 100 may decide, for example in the instance where time consuming obstacle avoidance or mechanical failure has been encountered, that it is not possible to complete the specified plan in the amount of time available to complete the tasks. In such instances, the robotic platform 100 may make autonomous decisions regarding the completion and deletion of various tasks or, conversely, may notify a human supervisor of an inability to complete the tasks as specified and may in response receive an updated plan.
- plans may be dynamically generated based on requirements. For example, a plan may be generated based on the directive to scrub all areas that can be scrubbed per a specified service plan. Such a directive may result in the production of a new plan to be transmitted to a robotic platform 100, or, conversely, the directive itself may be transmitted to a robotic platform 100 that proceeds to implement a current plan in accordance with the directive. In either case, the robotic platform 100 may avoid surfaces which are designated or attributed as not to be touched.
- the robotic platform 100 may make use of external data to inform the mode of operation. For example, if the robotic platform 100 receives information that the weather prior to the performance of a nightly floor cleaning is stormy or rainy, the robotic platform 100 may adjust the speed at which it performs sweeping of a floor area to take into account the likely increased presence of dried dirt and mud. Likewise, external data sources for temperature readings and building management information may be accessed and utilized.
- the robotic platform 100 may be employed as a real-time reconnaissance apparatus.
- the robotic platform 100 may, in addition to performing tasks specified in the plan, perform security tasks.
- such tasks may be passive.
- a robotic platform 100 may utilize a camera sensor and facial recognition software to note the presence of people in the environment in which it operates. Such data may be stored for later analysis or may be used to transmit alerts to human personnel in real time.
- the robotic platform 100 may utilize any of its sensors to detect distinctive sounds such as the breaking of glass or the sound of voices.
- the robotic platform 100 may record, attribute (such as with a time stamp), and store such noises for later retrieval and use.
- the robotic platform 100 may use near-IR sensors, IR sensors, thermal imaging sensors and/or thermal sensors to observe and record temperature variations and to act in response to the sensing of same. For example, if a robotic platform 100 senses the heat signature of footprints across a marble floor at midnight in an otherwise empty airport terminal, such information may be used to generate and transmit an alert to security personnel. Likewise, a robotic platform 100 may alert a user or human operator to the presence of smoke or fire. In other exemplary embodiments, the robotic platform 100 may sense various environmental parameters, the functioning of an HVAC system, and structural anomalies.
- robotic platforms 100 may act in concert to complete a plan.
- robotic platforms 100 may be enabled to engage in direct communication with one another or via an external server.
- a large environment such as an airport terminal may utilize multiple robotic platforms 100 running an overall coordinated service plan.
- The robotic platforms 100 may, for instance, be running with a similar type of service module 102A, such as a vacuum service module, in order to service a large open space in a shorter amount of time.
- a first robotic platform working on a first surface area may finish before a second robotic platform working on a second surface area, and may then begin working on the second surface area to shorten the time for completing work of vacuuming the total area.
- the first robotic platform may switch to a new service module 102B, such as a waxing service module, and begin the process of waxing area that has already been vacuumed.
- Multiple robotic platforms may coordinate automatically, such as based on an established coordinated plan, or through a user, such as communicating through and being controlled by a user through a user computing device 130.
- a robotic platform 100 may be equipped with a 2D LIDAR system 208.
- 2D LIDAR systems may be utilized for localization, pose estimation, navigation, surface detection and/or object/obstacle detection, mitigation, and avoidance.
- the 2D LIDAR system 208 may be positioned or adapted to provide a forward facing cross section of the environment around the robotic platform 100 and, generally, in the direction of travel. Because of the line of sight requirement for LIDAR systems, the body of the robotic platform 100 may cause a blind spot for the LIDAR imaging sensor 204, such as immediately around the robotic platform 100 and proceeding in a generally downward conic section out, away and to the floor.
- the positioning of a 2D LIDAR system 208 as illustrated and described allows for LIDAR, ultrasonic or vision-based detection of objects inside of the blind spot.
- a robotic platform may implement an algorithm to match the sensed surrounding environment to a predetermined three-dimensional model in order to determine a present location.
- it may prove difficult to perform such localization using such a methodology alone. For example, consider the case of a long hallway with identical doors positioned equidistant from one another. There may be various positions along such a hallway where the surrounding three-dimensional environment appears identical or nearly identical.
- there may be employed a network of two or more ultra-wide band RF locator nodes 150A-C to aid with localization, pose estimation and navigation.
- software may be employed using the three-dimensional model as input to deduce likely dead spots in ultra-wide band coverage and to suggest optimal or near optimal locations for placing ultra-wide band signal transmitters.
- Ultra-wide band signals are capable of passing through walls and other solid barriers.
- Ultra-wide band RF locator nodes 150A-C may be placed at known positions in an environment and may transmit identifying signals.
- a robotic platform 100 may exchange signals with one or more RF locator nodes 150A-C to determine distance measurements between the robotic platform 100 and the anchor RF locator nodes 150A-C. Based, in part, on the reception of two or more such transmissions, a robotic platform 100 may apply trigonometry to determine a location relative to the transmitters having absolute locations which are predetermined or otherwise known.
- a robotic platform 100 may additionally determine a position in a vertical or "z" direction in addition to the "x" and "y" horizontal coordinates. As described more fully herein, robotic platform 100 may make use of a plurality of RF locator nodes 150A-C when performing localization. In such instances, the robotic platform 100 may employ a statistical process, such as Extended Kalman filtering, to merge the location data in a manner that seeks to maximize the accuracy of the combined data readings. Such methods may assign weighting or greater relevance to data sources known to produce more reliable readings. Data sources may be weighted by inherent or recent conclusions regarding sensor calibration or functionality, or applicability of the sensor technology in a particular operational environment. In other cases, weighting may reflect a predefined hierarchy based on the immediate situation as perceived by the analysis of other sensor data.
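The trigonometric step above — fixing a position from ranged distances to anchors at known locations — can be sketched as planar trilateration. This is an illustrative sketch for three non-collinear anchors; it linearises the three circle equations by subtracting the first from the other two, leaving a 2x2 linear system solved by Cramer's rule. A z-coordinate or a statistical merge (e.g., Extended Kalman filtering) would build on the same range inputs.

```python
def trilaterate(anchors, dists):
    """Solve for (x, y) from three anchor positions and ranged distances.

    `anchors` are the known (x, y) positions of three RF locator nodes;
    `dists` are the measured distances to each. Anchors are assumed
    non-collinear so the linear system has a unique solution.
    """
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = dists
    # Subtract circle 1 from circles 2 and 3 to cancel the quadratic terms.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    return (c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det
```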
- the robotic platform may employ a rules-based decision-making process to determine when and if one or more location data sources should be used.
- the robotic platform 100 may operate in accordance with static or dynamically configurable default settings to determine when to incorporate different location data sources. For example, if, after a couple of seconds, the robotic platform 100 is unable to obtain a fix on its position using 3D LIDAR (or stereoscopic imaging sensors), it may switch to using ultra-wide band signals. The robotic platform 100 may subsequently switch back to using 3D LIDAR when it is feasible to do so. For example, it may be the case that the use of 3D LIDAR yields three possible locations for the robotic platform 100. The use of ultra-wide band signals may prove sufficient to resolve the ambiguity and the robotic platform 100 may proceed once again to use the 3D LIDAR for purposes of localization.
- the robotic platform 100 may employ various rule-based algorithms to weight the relevance of differing sensors and to determine which sensors to use at any given moment. For example, it may be determined to weight, generally, the sonar data received from ultrasonic sensors 108 over the use of LIDAR. Another rule may indicate that, when the LIDAR is rendered deficient due, for example, to direct sunlight falling on the sensor, the robotic platform 100 should switch to greater reliance on ultra-wide band signals. Over time, such rules may be altered via machine learning to more closely match the unique attributes of a specific environment. For example, a room that contains glass surfaces, such as aquariums, may negate the accurate operation of 3D LIDAR.
- tasks performed by the robotic platform 100 in the performance of a plan that requires cleaning such a room may proceed according to a rule that notes the preferred use of ultra-wide band signal localization when in the room.
- a rule may be embedded as an attribute of the area as specified in the three-dimensional model of the environment.
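The rule-based fallback between localization sources described above can be sketched as a short condition cascade. The flag names are hypothetical stand-ins for sensed conditions and for area attributes embedded in the three-dimensional model.

```python
def pick_localization_source(env):
    """Choose a localization source from sensed conditions (flags hypothetical).

    Mirrors the rules above: an area attribute (e.g., a room with glass
    surfaces) or direct sunlight on the LIDAR prefers ultra-wide band;
    failing to fix a position within a couple of seconds also falls back.
    """
    if env.get("area_prefers_uwb"):            # attribute from the 3D model
        return "uwb"
    if env.get("direct_sunlight_on_lidar"):    # LIDAR rendered deficient
        return "uwb"
    if env.get("lidar_fix_timeout_s", 0) > 2:  # no fix after a couple of seconds
        return "uwb"
    return "lidar_3d"                          # default: 3D LIDAR / stereo
```

The platform would re-evaluate this choice periodically, switching back to 3D LIDAR when feasible, as described above.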
- a robotic platform 100 may change its operational capabilities through reconfiguration. Specifically, each robotic platform 100 is adapted such that individual service modules 102A-D may be removed and inserted in an interchangeable fashion. As a result, it is not necessary to operate, for example, five separate robotic platforms 100 in order to perform tasks comprising five different modes of operation, e.g., mowing, cutting, sweeping, polishing, and the like. Rather, the same robotic platform 100 may configure its operating capabilities by swapping one service module 102A-D for another, in accordance with a service plan that is either determined by the robotic platform 100 or otherwise communicated to it.
- a damp-wet mopping system 800 may be provided for the cleaning of floors, such as depicted in Fig. 8.
- the mop surface is comprised of a roll of absorbent material stored on a first roll 804 that may be pulled transverse across the width of mopping system 800 and stored on a second roll 802.
- As the second roll 802 stores used absorbent material, it may increase in size while the amount of absorbent material on the first roll 804 decreases in complementary fashion.
- a predetermined length of the absorbent material is pulled across the width of the damp-wet mopping system 800 and locked into place, such as at an agitator 808. Once locked, the absorbent material is placed in contact with a surface to be cleaned.
- the service module providing the damp-wet mopping function may operate to agitate the material in a circular or linear motion as the robotic platform 100 moves across the floor. After a predetermined amount of time or distance is traveled across the floor, the section of the absorbent material in contact with the floor may be pulled through and stored on the second roll 802 providing a clean length of absorbent material for use.
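The advance-after-distance policy above can be sketched as a small helper computing how much absorbent material to wind onto the used roll. The interval and section length are hypothetical illustrative values, not figures from the disclosure.

```python
def material_to_advance(distance_m, advance_every_m=25.0, section_len_m=0.3):
    """Metres of absorbent material to pull through onto the used roll.

    Hypothetical policy: after every `advance_every_m` metres of floor
    travelled, one clean `section_len_m` length is pulled across the
    width of the mopping system and the soiled section is taken up.
    """
    return (distance_m // advance_every_m) * section_len_m
```

A time-based trigger, as also contemplated above, would substitute elapsed operating time for distance travelled.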
- the damp-wet mopping system 800 may comprise a cleaning fluid reservoir 806 for storing cleaning fluid.
- the floor cleaning service module that comprises the damp-wet mopping system may be in communication with a user through a user interface on a user computing device 130, as described herein.
- a non-limiting example of the user interface will be described, as illustrated in Figs. 9-33, to which references will be made in the following illustrative description.
- an illustrative home screen 900 may provide the user with a view of a plurality of top-level user options, such as a 'user' area, an 'administrator' area, a 'service' area, and the like.
- a user may be able to select a name through a name screen 1000, such as to allow a plurality of users to interact through the user interface, such as to gain access to robot platform functionality through a password screen 1100.
- a function screen 1200 may provide for a listing of robot functions, such as for selecting a cleaning plan, for manually selecting tasks, selecting a destination, to establish manual control, to access current tool information, to access settings, and the like.
- a cleaning plan screen 1300 may provide for aspects of a plan, such as the title of a plan, the tools required for a plan, and the like. For instance, a cleaning plan may be for damp-wet mopping a floor, and the damp-wet mopping system 800 is listed as a required tool to be used.
- a listing of plans screen 1400 may provide for the tool used, the work area, the task area, the approximate time required for the task, and the like.
- the work area may be a series of rooms, the task area may be in a hospital, and the approximate time for damp-wet mopping the rooms may be one hour.
- An add new plans screen 1500 may be provided, such as listing the new plan, the tool to be used, the task to be performed, a listing of the plan, an 'add' button for adding the new plan, and the like.
- a home location screen 1600 may be provided that indicates, for instance, where the robot platform would start mopping the floor for the task area.
- control and/or indicators of various functions and tasks may be provided, such as for manual control of the robotic platform, for a home location being reached, and the like.
- a status-control screen 1700 may be provided, such as providing current status information (e.g., approximate time for task completion, location, tool, and the like).
- Function control buttons may be provided, such as to cancel or continue a current task.
- a time-down screen 1800 may be provided, such as for when a cleaning is set to begin, an approximate time for the task to be completed, and the like, and where a cancel function button may be available.
- a progress screen 1900 may be provided, such as with a visual or percentage indication as to how far along the task has progressed.
- a service plan progress screen 2000 may be provided, such as illustrating the progress complete in the current task, the progress complete for a plurality of tasks, and the like.
- the robot platform 100 may indicate to the user that a tool needs to be changed, such as through a tool change required screen 2100. For instance, the robot platform 100 may have completed the damp-wet mopping task and needs to transition to a waxing task. In this instance, a first service module 102A may need to be switched for a second service module 102B. The need for switching tools may be indicated on an alert screen 2200, 2300, which may provide the user with the options to cancel cleaning, direct the robotic platform 100 to proceed to a service module exchange facility 114, and the like.
- a completion screen 2400 may be provided once the task or service plan is complete, such as providing a completion indication, that tasks have been complete, and the like, with options for viewing task completion options (e.g., viewing uncleaned locations, to conclude a task or plan).
- a completion screen 2500 may illustrate that the robotic platform 100 is returning to its home station.
- a map view illustration screen 2600 may be provided to indicate areas the robot platform 100 missed, such as skipping as a result of encountering an obstacle 146, needing to return to a special treatment area 144, and the like.
- a manual tasking screen 2700 may be provided, such as to specify a new task to be added.
- the task may be added to the current tasking for the day, to the service plan as a new task for subsequent days, and the like.
- a select destination screen 2800 may be provided, such as to send the robotic platform 100 to a specified location, floor, building, and the like.
- a plurality of screens may be provided to indicate status of the location change, such as a destination list screen 2900 provided to help identify the location to send the robotic platform 100, a selected location execution status screen 3000 to enable pausing or canceling the function, a system arrived screen 3100 to indicate completion of the location transfer, and the like.
- a manual control screen 3200 may be provided to initiate or cancel manual control of the robotic platform 100.
- a current tool screen 3300 may be provided, such as showing an image of the tool, status for tool heads, hours run, command buttons for attending to the service module (e.g., a need to change a bag or a consumable fluid, changing a cleaning head), and the like.
- an image of the tool may be provided along with status indications, such as with respect to the absorbent material and cleaning fluid used.
- the present disclosure describes methods and systems for a reconfigurable robotic platform 100 utilizing a plurality of interchangeable service modules 102A-D and adapted to engage in both autonomous and interactive maintenance and monitoring of a service area.
- the robotic platform 100 may be configured to perform a wide variety of tasks utilizing the plurality of interchangeable service modules, and navigating through the service area utilizing a plurality of sensors and guided through a stored service plan for the service area.
- Various embodiments of the robotic platform 100 will now be described.
- the robotic platform 100 may provide for service plan navigation utilizing a plurality of RF transmitting location nodes, such as implementing navigation of a service plan 128 through service areas 140A-B utilizing a plurality of RF locator nodes 150A-C.
- RF locator nodes 150A-C may be utilized in navigation alone or in combination with other navigation sensors 104, such as 3D LIDAR or stereoscopic imaging sensors, especially when such sensors are constrained in some way by the surrounding environment. As described herein, RF locator nodes are not essential to navigation. Other methods such as vision or LIDAR may be employed so long as ambiguity of position can be resolved through on-board sensors 104.
- RF locator nodes 150A-C may be utilized to provide a reference grid for resolving position ambiguity.
- the robotic platform 100 may comprise one of a plurality of removable service modules 102A-D and a propulsion mechanism 202 to position the robotic platform 100 at a plurality of
- the robotic platform 100 may provide for location and sensor-based detection of a transition between separately defined service areas 140A-B in a service plan 128, such as automatically detecting a surface change (e.g., rug to tile) at a location and determining that a tool change is required to transition from one service area 140A to another 140B as determined from the service plan 128.
- the robotic platform 100 comprising a removable service module 102A-D may sense a surrounding environment, operate the removable service module to service a generally planar surface proximate the robotic platform 100 in accordance with a service plan 128, and sense a need to change the removable service module 102A-D in response to a change in a type of the generally planar surface proximate the robotic platform 100, such as where the change is defined in the service plan.
- the robotic platform 100 may provide for sensor-based detection and tracking of a service-area-edge-condition during service area plan 128 execution, such as detecting and tracking a wall or edge of a floor type during a service plan execution.
- the robotic platform 100 comprising a removable service module 102A-D may sense a surrounding environment, and operate the removable service module 102A-D to service a generally planar surface proximate the robotic platform 100 in accordance with a service plan 128 where sensing the environment comprises sensing at least one edge of at least one service area 140A-B defined in the service plan 128.
- the robotic platform 100 may provide for sensor-based detection of a service event condition within a single defined service area 140A, such as finding a location within an area that requires special attention beyond the routine service plan (e.g., a special treatment area 144) but for which there may exist an action protocol (e.g., spot treatment of the area).
- the robotic platform 100 may sense an environment around a robotic platform 100 comprising a removable service module 102A-D, and operate the removable service module 102A-D to service a generally planar surface proximate the robotic platform 100 in accordance with a service plan 128, and deviate from the service plan 128 in the event that an anomalous condition of the planar surface is detected.
- the robotic platform 100 may provide for sensor-based disruption detection and avoidance during execution of a service plan 128 in a service area 140A, such as when the robotic platform 100 encounters an unplanned obstacle 146 (e.g., a chair out of place), avoids it, and continues service.
- the robotic platform 100 comprising a removable service module 102A-D may sense a surrounding environment, and operate the removable service module 102A-D to service a generally planar surface proximate the robotic platform 100 in accordance with a service plan 128, and deviate from the service plan 128 in the event that an obstacle 146 to the servicing of the planar surface is detected.
- the robotic platform 100 may store the location of the obstacle 146 as a location to return to in subsequent service.
- the robotic platform 100 may provide for operational service plan disruption and return optimization for a service robotic platform 100, such as when the robotic platform 100 encounters an obstacle 146 that forces it to skip a portion of a task, and establishes a return plan for subsequently returning (e.g., based on a calculated cost-trade for the obstacle relative to the overall service plan).
- the robotic platform 100 comprising a removable service module 102A-D may sense a surrounding environment, operate the removable service module 102A-D to service a generally planar surface proximate the robotic platform 100 in accordance with a service plan 128, deviate from the service plan 128 so as to suspend a performance of a task defined in the service plan 128, produce an updated service plan comprising at least one task directed to completing the suspended task, and operate the removable service module 102A-D to service the generally planar surface proximate the robotic platform 100 in accordance with the updated service plan.
- the robotic platform 100 may provide for an alert transmission of a detected service-module-change-condition to a wireless user interface, such as alerting an on-site user with a user computing device 130 that the robot needs to have a service module 102A-D manually changed out (e.g., at a service module exchange facility 114).
- the robotic platform 100 comprising a removable service module 102A-D adapted to perform a predefined service task may receive an alert, where the removable service module 102A-D is manually replaced in response, at least in part, to the alert.
- the robotic platform 100 may provide for automatic replacement of a service module 102A-D in a service module docking station (e.g., service robot resource facility 110), such as when the robotic platform 100 automatically replaces the service module 102A-D with limited user assistance.
- the robotic platform 100 comprising a removable service module 102A-D adapted to perform a predefined service task may receive an alert, and automatically replace the removable service module 102A-D in response, at least in part, to the alert.
- the robotic platform 100 may provide for operative service area planning for a robotic platform 100 through digital map annotation, such as providing manual service planning through annotating a digital map that has been loaded into the robotic platform 100.
- Digital maps may be 2D or 3D digital maps, generated in conjunction with operation of the robotic platform 100 or downloaded to the robotic platform from an external computing device (e.g., when the digital map was generated externally).
- the robotic platform 100 comprising a removable service module may receive a service plan 128 comprising instructions for servicing a generally planar surface, where the service plan 128 is derived, at least in part, from an annotated map, and operate the robot to service the generally planar surface.
- the robotic platform 100 may provide for identification of a surface type through a service area annotation in a stored digital map, such as annotating surface type through the stored digital map during service planning (e.g., the surface type is an indoor-outdoor rug, a linoleum tile, thick carpet).
- the robotic platform 100 comprising a removable service module may receive a service plan 128 comprising instructions for servicing a generally planar surface, where the service plan 128 is derived, at least in part, from an annotated map specifying a surface type, and operate the robot to service the generally planar surface.
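One way to picture deriving a plan from surface-type annotations is a simple lookup from annotated area to the service module suited to that surface. The area names, surface-to-module mapping, and function name below are all illustrative assumptions, not part of the disclosure:

```python
# Hypothetical annotated map: area name -> annotated surface type.
annotated_map = {
    "entrance": "indoor-outdoor rug",
    "kitchen": "linoleum tile",
    "office": "thick carpet",
}

# Assumed mapping from surface type to an appropriate service module.
MODULE_FOR_SURFACE = {
    "indoor-outdoor rug": "vacuum",
    "linoleum tile": "scrubber",
    "thick carpet": "deep-clean brush",
}

def derive_service_plan(annotated_map):
    """Derive per-area service instructions from surface-type annotations."""
    return [
        {"area": area, "surface": surface, "module": MODULE_FOR_SURFACE[surface]}
        for area, surface in annotated_map.items()
    ]

plan = derive_service_plan(annotated_map)
```

The point of the sketch is that the annotation, not a runtime sensor reading, selects the tool for each area of the plan.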
- the robotic platform 100 may provide for multi-operative service area planning through a digital map utility, such as multi-area planning of different service areas 140A-B using a digital map loaded into the robotic platform 100.
- the robotic platform 100 comprising a removable service module may receive a service plan 128 comprising instructions for servicing a plurality of generally planar surfaces, and operate the robotic platform 100 to service the generally planar surfaces.
- the robotic platform 100 may provide for digital map utilization in the automatic execution of a service plan 128, such as the robotic platform 100 using the digital map to execute a stored service plan 128.
- the robotic platform 100 comprising a removable service module may receive a service plan 128 comprising instructions for servicing a generally planar surface of a service area 140A-B, and utilize a digital map of the service area 140A-B to operate the robotic platform 100 to service the generally planar surface.
- the robotic platform 100 may provide for real-time service area plan identification through utilization of a stored digital map and surface type sensors, such as the robotic platform 100 determining (e.g., in real-time) an execution plan in a service area 140A-B guided by a digital map and utilizing sensors 104 to detect a surface type and required service tool.
- the robotic platform 100 comprising a removable service module may sense a surrounding environment, derive a service plan 128 for servicing the environment based, at least in part, on the sensed environment and a digital map of the environment, and operate the removable service module 102A-D to service a generally planar surface proximate the robotic platform 100 in accordance with the service plan 128.
- the robotic platform 100 may provide for automatic service area mapping by a robotic platform 100, such as where the robotic platform 100 maps service areas 140A-B to produce a service map.
- the robot mapping may be an autonomous process, where the robotic platform 100 maps the service area(s) 140A-B by exploring the area on its own, or the robot mapping may be performed in conjunction with aid from a user, such as through a user operating the robotic platform 100 through manual control.
- the robotic platform 100 comprising a removable service module 102A-D may sense a surrounding environment, and produce a service map of the environment based, at least in part, on the sensed environment.
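A service map built from sensed obstacle positions could take the form of an occupancy grid. This is a toy sketch under assumed conventions (0 = serviceable floor, 1 = obstacle), not the patent's mapping method:

```python
def build_service_map(width, height, obstacle_readings):
    """Produce a simple occupancy-grid service map from sensed obstacle
    coordinates. Cells default to serviceable floor (0); sensed
    obstacles mark cells as unserviceable (1)."""
    grid = [[0] * width for _ in range(height)]
    for (x, y) in obstacle_readings:
        if 0 <= x < width and 0 <= y < height:  # ignore out-of-bounds readings
            grid[y][x] = 1
    return grid

service_map = build_service_map(4, 3, [(1, 1), (2, 0)])
```

Whether the exploration is autonomous or user-guided, the output is the same kind of structure: a grid the planner can later annotate and route over.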
- the robotic platform 100 may provide for sensor-based adaptive service plan tasking, such as adjustment of a service plan task execution based on environment changes (e.g., unexpected obstacles and/or human presence).
- the robotic platform 100 comprising a removable service module 102A-D may sense a surrounding environment, operate the removable service module 102A-D to service a generally planar surface proximate the robotic platform 100 in accordance with a service plan, deviate from the service plan so as to suspend performance of a task defined in the service plan in response to a detected change in the sensed environment, produce an updated service plan comprising at least one task directed to completing the suspended task, and operate the removable service module 102A-D to service the generally planar surface proximate the robotic platform 100 in accordance with the updated service plan.
- the robotic platform 100 may provide for machine learning adaptive service plan tasking, such as the robotic platform 100 adjusting its plan tasking (e.g., from day to day) based on what it has learned from past service executions.
- the robotic platform 100 comprising a removable service module may sense a surrounding environment, and operate the removable service module 102A-D to service a generally planar surface proximate the robotic platform 100 in accordance with a predetermined service plan 128, where the service plan 128 incorporates information gathered by the robotic platform 100 during past operations of the service module 102A-D.
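One concrete (and deliberately simple) form such learning could take is ordering service areas by how dirty they have historically been, so day-to-day plans adapt to observed usage. The class, its methods, and the soil metric are illustrative assumptions:

```python
from collections import defaultdict

class AdaptivePlanner:
    """Orders service areas by average soil level observed in past runs."""

    def __init__(self):
        self.history = defaultdict(list)

    def record_run(self, soil_by_area):
        """Store per-area soil readings gathered during a service execution."""
        for area, soil in soil_by_area.items():
            self.history[area].append(soil)

    def plan(self, areas):
        """Return areas ordered dirtiest-first, based on past executions."""
        def avg_soil(area):
            readings = self.history.get(area)
            return sum(readings) / len(readings) if readings else 0.0
        return sorted(areas, key=avg_soil, reverse=True)

planner = AdaptivePlanner()
planner.record_run({"lobby": 0.8, "office": 0.2})
planner.record_run({"lobby": 0.6, "office": 0.3})
order = planner.plan(["office", "lobby", "atrium"])
```

Areas never seen before (here, the atrium) default to the lowest priority until the robot has gathered data on them.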
- the robotic platform 100 may provide for operational sensor suite optimization selection for power conservation in the robotic platform, such as for conserving power through selection of active sensors during service.
- the robotic platform 100 comprising a removable service module 102A-D may sense a surrounding environment using one or more active sensors 104 around the robotic platform 100, and operate the removable service module 102A-D to service a generally planar surface proximate the robotic platform 100 in accordance with a service plan 128 while selectively activating at least one of the active sensors 104.
- Selection of active sensors 104 may also provide optimized navigation performance, adjustment to environmental conditions, and the like.
- the robotic platform 100 may provide for operational sensor suite optimization selection based on changes in environmental conditions, such as changing the sensors 104 to improve performance in the presence of a changed environment (e.g., the environment may require an increased resolution from the sensor suite, such as for increased human presence, or the presence of a detected plurality of obstacles 146).
- the robotic platform 100 comprising a removable service module 102A-D may sense an environment using one or more active sensors 104 around the robotic platform 100, and operate the removable service module 102A-D to service a generally planar surface proximate the robotic platform 100 in accordance with a service plan 128 while selectively changing at least one of the active sensors 104 based on a sensed change in the environment.
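A sensor-selection policy of this kind can be expressed as a small decision function. The sensor names, thresholds, and the rule itself are assumptions made for illustration; the idea is only that the active suite grows when the environment demands resolution and shrinks when power must be conserved:

```python
def select_active_sensors(battery_fraction, humans_present, obstacle_count):
    """Choose which sensors to power, trading resolution against power.
    Thresholds and sensor names are hypothetical."""
    active = {"bump", "odometry"}           # minimal always-on suite
    if humans_present or obstacle_count > 3:
        active |= {"lidar", "camera"}       # full resolution when crowded
    elif battery_fraction > 0.3:
        active.add("lidar")                 # lidar only when power allows
    return active
```

So a quiet area on a low battery runs on the minimal suite, while detected human presence switches the full suite on regardless of battery state.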
- the robotic platform 100 may provide for placement optimization of transmitting RF location nodes 150A-C for service plan navigation, such as optimum placement of RF location nodes 150A-C prior to service plan execution.
- the robotic platform 100 comprising a removable service module may determine a location for each of a plurality of RF transmitting location nodes, and position the robotic platform 100 at a predetermined location, where the predetermined location is based, at least in part, on the one or more transmitting RF location nodes 150A-C.
- the robotic platform 100 may provide for service robot navigation through a service area location grid established through a plurality of transmitting RF location nodes 150A-C, such as for navigating through a grid established through RF location nodes 150A-C.
- the robotic platform 100 comprising a removable service module 102A-D may engage a propulsion mechanism 202 of the robotic platform 100 to position the robotic platform 100 at a plurality of predetermined locations within a service area 140A-B, where the predetermined positions are located based, at least in part, on a grid established by a plurality of RF locator nodes 150A-C, and operate the removable service module 102A-D to service a generally planar surface proximate the robotic platform 100.
- the robotic platform 100 may provide for service robot navigation during external navigation source disruption, such as providing for robot platform navigation through dead reckoning when RF locator node signals are temporarily unavailable (e.g., dead spot).
- the robotic platform 100 comprising a removable service module 102A-D may engage a propulsion mechanism to position the robotic platform 100 at a plurality of predetermined locations using dead reckoning, and operate the removable service module 102A-D to service a generally planar surface proximate the robot.
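Dead reckoning during an RF dead spot amounts to integrating heading-and-distance odometry from the last known fix. A minimal sketch (assumed conventions: heading in degrees, 0° along +x):

```python
import math

def dead_reckon(start, steps):
    """Integrate (heading_deg, distance) odometry steps from the last
    known position fix; used when RF locator signals are unavailable."""
    x, y = start
    for heading_deg, distance in steps:
        h = math.radians(heading_deg)
        x += distance * math.cos(h)
        y += distance * math.sin(h)
    return x, y

# Drive 2 m east, then 3 m north from the last RF fix at the origin.
pos = dead_reckon((0.0, 0.0), [(0.0, 2.0), (90.0, 3.0)])
```

Odometry error accumulates with distance traveled, which is why the estimate is re-anchored to the RF grid as soon as node signals return.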
- the robotic platform 100 may provide for coordination of a plurality of service robots operating in proximity to one another, such as where multiple coordinated service robots act together to execute a service plan 128.
- the robotic platform 100 comprising a removable service module 102A-D may engage a propulsion mechanism to position the robotic platform 100 at a predetermined location for operating the removable service module 102A-D to service a generally planar surface proximate the robotic platform 100 in accordance with a service plan 128, and receive information from at least one other robotic platform 100 operating in accordance with the service plan to facilitate cooperative completion of the service plan.
- the robotic platform 100 may provide for shared modular tool utilization amongst a plurality of service robots, such as for when multiple robotic platforms are using the same service modules 102A-D.
- the robotic platform 100 comprising a removable service module 102A-D may engage a propulsion mechanism to position the robotic platform 100 at a predetermined location for operating the removable service module 102A-D to service a generally planar surface proximate the robot in accordance with a service plan 128, receive information from at least one other robotic platform 100 operating in accordance with the service plan 128 to facilitate cooperative completion of the service plan 128, and, in response to the received information, provide the removable service module 102A-D to the other robotic platform 100.
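Sharing a module among robots implies some arbitration over who holds it. The sketch below shows one hypothetical scheme (a central pool granting exclusive holds); the class and method names are assumptions, and a real fleet would also handle timeouts and physical handover:

```python
class ModulePool:
    """Tracks which robot currently holds each shared service module."""

    def __init__(self, modules):
        self.holder = {m: None for m in modules}

    def request(self, robot, module):
        """Grant the module only if no other robot holds it."""
        if self.holder[module] is None:
            self.holder[module] = robot
            return True
        return False

    def release(self, robot, module):
        """Free the module, allowing another robot to receive it."""
        if self.holder[module] == robot:
            self.holder[module] = None

pool = ModulePool(["scrubber"])
pool.request("robot-A", "scrubber")            # robot A takes the scrubber
denied = pool.request("robot-B", "scrubber")   # denied while A holds it
pool.release("robot-A", "scrubber")            # A hands the module over
granted = pool.request("robot-B", "scrubber")  # now B may take it
```

The "provide the removable service module to the other robotic platform" step in the text corresponds to the release/request handoff here.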
- the robotic platform 100 may provide for service robot pose determination through a service area location grid established through a plurality of transmitting RF location nodes 150A-C, such as for determining the robot's orientation from a grid established through RF location nodes 150A-C.
- the robotic platform 100 comprising a removable service module 102A-D may engage a propulsion mechanism to position the robotic platform 100 at a plurality of predetermined locations, where the predetermined positions are located based, at least in part, on a grid of one or more transmitting RF location nodes, and operate the removable service module 102A-D to service a generally planar surface proximate the robotic platform 100.
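One simple route from grid positions to orientation: take two successive RF fixes while the robot moves and compute the heading of the displacement vector. This is an illustrative technique, not necessarily the disclosed one (which could equally use multiple antennas on one platform):

```python
import math

def heading_from_fixes(prev_fix, curr_fix):
    """Estimate robot heading in degrees (0 = +x axis, counterclockwise)
    from two successive position fixes obtained from the RF grid."""
    dx = curr_fix[0] - prev_fix[0]
    dy = curr_fix[1] - prev_fix[1]
    return math.degrees(math.atan2(dy, dx)) % 360.0
```

Combined with the position fix itself, this yields a full 2-D pose (x, y, heading) for the platform.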
- the term "processor" as used herein is synonymous with the plural "processors" and the two terms may be used interchangeably unless context clearly indicates otherwise.
- the processor may be part of a server, client, network infrastructure, mobile computing platform, stationary computing platform, or other computing platform.
- a processor may be any kind of computational or processing device capable of executing program instructions, codes, binary instructions and the like.
- the processor may be or include a signal processor, digital processor, embedded processor, microprocessor or any variant such as a co-processor (math co-processor, graphic coprocessor, communication co-processor and the like) and the like that may directly or indirectly facilitate execution of program code or program instructions stored thereon.
- the processor may enable execution of multiple programs, threads, and codes. The threads may be executed simultaneously to enhance the performance of the processor and to facilitate simultaneous operations of the application.
- methods, program codes, program instructions and the like described herein may be implemented in one or more threads.
- the thread may spawn other threads that may have assigned priorities associated with them; the processor may execute these threads based on priority or any other order based on instructions provided in the program code.
- the processor may include memory that stores methods, codes, instructions and programs as described herein and elsewhere.
- the processor may access a storage medium through an interface that may store methods, codes, and instructions as described herein and elsewhere.
- the storage medium associated with the processor for storing methods, programs, codes, program instructions or other type of instructions capable of being executed by the computing or processing device may include but may not be limited to one or more of a CD-ROM, DVD, memory, hard disk, flash drive, RAM, ROM, cache and the like.
- a processor may include one or more cores that may enhance speed and performance of a multiprocessor.
- the processor may be a dual-core processor, quad-core processor, or other chip-level multiprocessor that combines two or more independent cores (called a die).
- the methods and systems described herein may be deployed in part or in whole through a machine that executes computer software on a server, client, firewall, gateway, hub, router, or other such computer and/or networking hardware.
- the software program may be associated with a server that may include a file server, print server, domain server, Internet server, intranet server and other variants such as secondary server, host server, distributed server and the like.
- the server may include one or more of memories, processors, computer readable media, storage media, ports (physical and virtual), communication devices, and interfaces capable of accessing other servers, clients, machines, and devices through a wired or a wireless medium, and the like.
- the methods, programs or codes as described herein and elsewhere may be executed by the server.
- other devices required for execution of methods as described in this application may be considered as a part of the infrastructure associated with the server.
- the server may provide an interface to other devices including, without limitation, clients, other servers, printers, database servers, print servers, file servers, communication servers, distributed servers and the like. Additionally, this coupling and/or connection may facilitate remote execution of a program across the network. The networking of some or all of these devices may facilitate parallel processing of a program or method at one or more locations without deviating from the scope.
- any of the devices attached to the server through an interface may include at least one storage medium capable of storing methods, programs, code and/or instructions.
- a central repository may provide program instructions to be executed on different devices.
- the remote repository may act as a storage medium for program code, instructions, and programs.
- the software program may be associated with a client that may include a file client, print client, domain client, Internet client, intranet client and other variants such as secondary client, host client, distributed client and the like.
- the client may include one or more of memories, processors, computer readable media, storage media, ports (physical and virtual), communication devices, and interfaces capable of accessing other clients, servers, machines, and devices through a wired or a wireless medium, and the like.
- the methods, programs or codes as described herein and elsewhere may be executed by the client.
- other devices required for execution of methods as described in this application may be considered as a part of the infrastructure associated with the client.
- the client may provide an interface to other devices including, without limitation, servers, other clients, printers, database servers, print servers, file servers, communication servers, distributed servers and the like. Additionally, this coupling and/or connection may facilitate remote execution of a program across the network. The networking of some or all of these devices may facilitate parallel processing of a program or method at one or more locations without deviating from the scope.
- any of the devices attached to the client through an interface may include at least one storage medium capable of storing methods, programs, applications, code and/or instructions.
- a central repository may provide program instructions to be executed on different devices.
- the remote repository may act as a storage medium for program code, instructions, and programs.
- the methods and systems described herein may be deployed in part or in whole through network infrastructures.
- the network infrastructure may include elements such as computing devices, servers, routers, hubs, firewalls, clients, personal computers, communication devices, routing devices and other active and passive devices, modules and/or components as known in the art.
- the computing and/or non-computing device(s) associated with the network infrastructure may include, apart from other components, a storage medium such as flash memory, buffer, stack, RAM, ROM and the like.
- the processes, methods, program codes, instructions described herein and elsewhere may be executed by one or more of the network infrastructural elements.
- the methods, program codes, and instructions described herein and elsewhere may be implemented on a cellular network having multiple cells.
- the cellular network may be either a frequency division multiple access (FDMA) network or a code division multiple access (CDMA) network.
- the cellular network may include mobile devices, cell sites, base stations, repeaters, antennas, towers, and the like.
- the cell network may be a GSM, GPRS, 3G, 4G, LTE, EVDO, mesh, or other network type.
- the methods, programs codes, and instructions described herein and elsewhere may be implemented on or through mobile devices.
- the mobile devices may include navigation devices, cell phones, mobile phones, mobile personal digital assistants, laptops, palmtops, netbooks, pagers, electronic books readers, music players and the like. These devices may include, apart from other components, a storage medium such as a flash memory, buffer, RAM, ROM and one or more computing devices.
- the computing devices associated with mobile devices may be enabled to execute program codes, methods, and instructions stored thereon. Alternatively, the mobile devices may be configured to execute instructions in collaboration with other devices.
- the mobile devices may communicate with base stations interfaced with servers and configured to execute program codes.
- the mobile devices may communicate on a peer-to-peer network, mesh network, or other communication network.
- the program code may be stored on the storage medium associated with the server and executed by a computing device embedded within the server.
- the base station may include a computing device and a storage medium.
- the storage device may store program codes and instructions executed by the computing devices associated with the base station.
- the computer software, program codes, and/or instructions may be stored and/or accessed on machine readable media that may include: computer components, devices, and recording media that retain digital data used for computing for some interval of time; semiconductor storage known as random access memory (RAM); mass storage typically for more permanent storage, such as optical discs and forms of magnetic storage like hard disks, tapes, drums, and cards; processor registers, cache memory, volatile memory, and non-volatile memory; optical storage such as CDs and DVDs; removable media such as flash memory (e.g., USB sticks or keys), floppy disks, magnetic tape, paper tape, punch cards, standalone RAM disks, Zip drives, removable mass storage, and off-line storage; and other computer memory such as dynamic memory, static memory, read/write storage, mutable storage, read-only storage, random access, sequential access, location addressable, file addressable, content addressable, network attached storage, storage area networks, bar codes, magnetic ink, and the like.
- the methods and systems described herein may transform physical and/or intangible items from one state to another.
- the methods and systems described herein may also transform data representing physical and/or intangible items from one state to another.
- the depicted elements and the functions thereof may be implemented on machines through computer executable media having a processor capable of executing program instructions stored thereon as a monolithic software structure, as standalone software modules, or as modules that employ external routines, code, services, and so forth, or any combination of these, and all such implementations may be within the scope of the present disclosure.
- machines may include, but may not be limited to, personal digital assistants, laptops, personal computers, mobile phones, other handheld computing devices, medical equipment, wired or wireless communication devices, transducers, chips, calculators, satellites, tablet PCs, electronic books, gadgets, electronic devices, devices having artificial intelligence, computing devices, networking equipment, servers, routers and the like.
- the elements depicted in the flow chart and block diagrams or any other logical component may be implemented on a machine capable of executing program instructions.
- the methods and/or processes described above, and steps thereof, may be realized in hardware, software or any combination of hardware and software suitable for a particular application.
- the hardware may include a general-purpose computer and/or dedicated computing device or specific computing device or particular aspect or component of a specific computing device.
- the processes may be realized in one or more microprocessors, microcontrollers, embedded microcontrollers, programmable digital signal processors, or other programmable devices, along with internal and/or external memory.
- the processes may also, or instead, be embodied in an application specific integrated circuit, a programmable gate array, programmable array logic, or any other device or combination of devices that may be configured to process electronic signals. It may further be appreciated that one or more of the processes may be realized as a computer executable code capable of being executed on a machine-readable medium.
- the computer executable code may be created using a structured programming language such as C, an object oriented programming language such as C++, or any other high-level or low-level programming language (including assembly languages, hardware description languages, and database programming languages and technologies) that may be stored, compiled or interpreted to run on one of the above devices, as well as heterogeneous combinations of processors, processor architectures, or combinations of different hardware and software, or any other machine capable of executing program instructions.
- each method described above and combinations thereof may be embodied in computer executable code that, when executing on one or more computing devices, performs the steps thereof.
- the methods may be embodied in systems that perform the steps thereof, and may be distributed across devices in a number of ways, or all of the functionality may be integrated into a dedicated, standalone device or other hardware.
- the means for performing the steps associated with the processes described above may include any of the hardware and/or software described above. All such permutations and combinations are intended to fall within the scope of the present disclosure.
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Aviation & Aerospace Engineering (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Multimedia (AREA)
- General Engineering & Computer Science (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201662361557P | 2016-07-13 | 2016-07-13 | |
PCT/US2017/041495 WO2018013538A1 (en) | 2016-07-13 | 2017-07-11 | Apparatus and methods for providing a reconfigurable robotic platform |
Publications (2)
Publication Number | Publication Date |
---|---|
EP3484678A1 true EP3484678A1 (en) | 2019-05-22 |
EP3484678A4 EP3484678A4 (en) | 2020-02-19 |
Family
ID=60953365
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP17828284.4A Withdrawn EP3484678A4 (en) | 2016-07-13 | 2017-07-11 | Apparatus and methods for providing a reconfigurable robotic platform |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP3484678A4 (en) |
CA (1) | CA3030027A1 (en) |
WO (1) | WO2018013538A1 (en) |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10328573B2 (en) | 2015-01-06 | 2019-06-25 | Discovery Robotics | Robotic platform with teach-repeat mode |
US11400595B2 (en) | 2015-01-06 | 2022-08-02 | Nexus Robotics Llc | Robotic platform with area cleaning mode |
US10518407B2 (en) | 2015-01-06 | 2019-12-31 | Discovery Robotics | Apparatus and methods for providing a reconfigurable robotic platform |
US9722640B2 (en) | 2015-01-06 | 2017-08-01 | Discovery Robotics | Method and system for determining precise robotic position and orientation using near-simultaneous radio frequency measurements |
USD869108S1 (en) | 2016-07-14 | 2019-12-03 | Discovery Robotics | Robot comprising a service module |
DE102018008269A1 (en) * | 2018-04-13 | 2019-10-17 | Hako Gmbh | street sweeper |
WO2019203878A1 (en) * | 2018-04-20 | 2019-10-24 | Discovery Robotics | Apparatus and methods of a service robotic platform |
WO2020076610A1 (en) * | 2018-10-08 | 2020-04-16 | R-Go Robotics Ltd. | System and method for geometrical user interactions via three-dimensional mapping |
CN109571482B (en) | 2019-01-02 | 2021-08-27 | 京东方科技集团股份有限公司 | Cleaning robot path planning method, related system and readable storage medium |
KR102295824B1 (en) * | 2019-12-06 | 2021-08-31 | 엘지전자 주식회사 | Mapping method of Lawn Mower Robot. |
EP4080311A1 (en) * | 2021-04-23 | 2022-10-26 | Carnegie Robotics, LLC | A method of operating one or more robots |
CN113375664B (en) * | 2021-06-09 | 2023-09-01 | 成都信息工程大学 | Autonomous mobile device positioning method based on dynamic loading of point cloud map |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6338013B1 (en) * | 1999-03-19 | 2002-01-08 | Bryan John Ruffner | Multifunctional mobile appliance |
SE0004465D0 (en) * | 2000-12-04 | 2000-12-04 | Abb Ab | Robot system |
JP2005046926A (en) * | 2003-07-30 | 2005-02-24 | Toshiba Corp | Service robot system, main robot and follower robot |
KR100664053B1 (en) * | 2004-09-23 | 2007-01-03 | 엘지전자 주식회사 | Cleaning tool auto change system and method for robot cleaner |
KR100749579B1 (en) * | 2005-09-05 | 2007-08-16 | 삼성광주전자 주식회사 | Moving Robot having a plurality of changeable work module and Control Method for the same |
KR101059829B1 (en) * | 2009-01-29 | 2011-08-29 | (주)리눅스아이티 | Module replaceable robotic system |
ES2827192T3 (en) * | 2012-02-08 | 2021-05-20 | Omron Tateisi Electronics Co | Task management system for a fleet of autonomous mobile robots |
2017
- 2017-07-11 CA CA3030027A patent/CA3030027A1/en not_active Abandoned
- 2017-07-11 EP EP17828284.4A patent/EP3484678A4/en not_active Withdrawn
- 2017-07-11 WO PCT/US2017/041495 patent/WO2018013538A1/en unknown
Also Published As
Publication number | Publication date |
---|---|
EP3484678A4 (en) | 2020-02-19 |
WO2018013538A1 (en) | 2018-01-18 |
CA3030027A1 (en) | 2018-01-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10913148B2 (en) | Operational service plan disruption and return optimization for a service robot | |
US11400595B2 (en) | Robotic platform with area cleaning mode | |
US10328573B2 (en) | Robotic platform with teach-repeat mode | |
US20180364045A1 (en) | Robotic platform with mapping facility | |
US20200047337A1 (en) | Robotic platform with event based mode change | |
US20180361585A1 (en) | Robotic platform with multi-function service module | |
US20200047343A1 (en) | Remote planning and locally adaptive service mapping | |
EP3484678A1 (en) | Apparatus and methods for providing a reconfigurable robotic platform | |
US20180361584A1 (en) | Robotic platform with long-term learning | |
US20180361581A1 (en) | Robotic platform with following mode | |
US11576543B2 (en) | Robotic vacuum with rotating cleaning apparatus | |
US11927450B2 (en) | Methods for finding the perimeter of a place using observed coordinates | |
EP3785093B1 (en) | Robot contextualization of map regions | |
JP7259015B2 (en) | Mobile robot and its control method | |
EP3787458B1 (en) | A plurality of robot cleaners | |
US6580246B2 (en) | Robot touch shield | |
EP3552072B1 (en) | Robotic cleaning device with operating speed variation based on environment | |
WO2019203878A1 (en) | Apparatus and methods of a service robotic platform | |
WO2020086557A1 (en) | Apparatus and method for operations of a robotic platform | |
US11747819B1 (en) | Robotic fire extinguisher | |
US11947015B1 (en) | Efficient coverage planning of mobile robotic devices | |
US11989021B1 (en) | Method for overcoming obstructions of a robotic device | |
US20230123512A1 (en) | Robotic cleaning device with dynamic area coverage | |
US12111664B1 (en) | Versatile robotic device | |
KR102581196B1 (en) | Airport robot and computer readable recording medium of performing operating method of thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
17P | Request for examination filed |
Effective date: 20190208 |
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
AX | Request for extension of the european patent |
Extension state: BA ME |
RIN1 | Information on inventor provided before grant (corrected) |
Inventor name: CALLEN, DAVID Inventor name: KUMAR, SUJITH, VIJAYA Inventor name: RAJENDRAN, VIVEK Inventor name: WILLIAMS, LARRY, J. Inventor name: SHAH, ISHIT Inventor name: ULM, BRITTA, KATHLEEN Inventor name: SHAH, HARDIK Inventor name: KADEM, DHARMATEJA Inventor name: KUMAR, NAMAN Inventor name: BLUM, JEFFREY Inventor name: BAL, MANOMIT |
RIN1 | Information on inventor provided before grant (corrected) |
Inventor name: KUMAR, NAMAN Inventor name: KADEM, DHARMATEJA Inventor name: BLUM, JEFFREY Inventor name: CALLEN, DAVID Inventor name: SHAH, ISHIT Inventor name: BAL, MANOMIT Inventor name: KUMAR, SUJITH, VIJAYA Inventor name: SHAH, HARDIK Inventor name: RAJENDRAN, VIVEK Inventor name: ULM, BRITTA, KATHLEEN Inventor name: WILLIAMS, LARRY, J. |
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
A4 | Supplementary search report drawn up and despatched |
Effective date: 20200120 |
RIC1 | Information provided on ipc code assigned before grant |
Ipc: B25J 9/16 20060101ALI20200114BHEP Ipc: G05D 1/02 20200101AFI20200114BHEP Ipc: B25J 19/02 20060101ALI20200114BHEP Ipc: B25J 11/00 20060101ALI20200114BHEP |
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
18D | Application deemed to be withdrawn |
Effective date: 20200818 |