US20150199829A1 - Method and apparatus for creating structural drawing - Google Patents

Method and apparatus for creating structural drawing

Info

Publication number
US20150199829A1
US20150199829A1
Authority
US
United States
Prior art keywords
wall
image information
height
structural drawing
electronic device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US14/586,139
Other versions
US9607413B2 (en
Inventor
Dusan BAEK
Sunghoo Kim
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Assigned to SAMSUNG ELECTRONICS CO., LTD. reassignment SAMSUNG ELECTRONICS CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BAEK, DUSAN, KIM, SUNGHOO
Publication of US20150199829A1
Application granted
Publication of US9607413B2
Legal status: Expired - Fee Related

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/002D [Two Dimensional] image generation
    • G06T11/20Drawing from basic elements, e.g. lines or circles
    • G06F17/5004
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00Computer-aided design [CAD]
    • G06F30/10Geometric CAD
    • G06F30/13Architectural design, e.g. computer-aided architectural design [CAAD] related to design of buildings, bridges, landscapes, production plants or roads
    • G06K9/4604
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/536Depth or shape recovery from perspective effects, e.g. by using vanishing points
    • G06T7/602
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2111/00Details relating to CAD techniques
    • G06F2111/12Symbolic schematics
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00Indexing scheme for image data processing or generation, in general
    • G06T2200/21Indexing scheme for image data processing or generation, in general involving computational photography
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2210/00Indexing scheme for image generation or computer graphics
    • G06T2210/04Architectural design, interior design
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2210/00Indexing scheme for image generation or computer graphics
    • G06T2210/61Scene description
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/24Aligning, centring, orientation detection or correction of the image
    • G06V10/247Aligning, centring, orientation detection or correction of the image by affine transforms, e.g. correction due to perspective effects; Quadrilaterals, e.g. trapezoids

Definitions

  • the present disclosure relates to a method and apparatus for creating structural drawings. More particularly, the present disclosure relates to a method and apparatus for creating structural drawings based on an image processing technology.
  • Recent electronic devices have been equipped with a technology for registering indoor structural drawings and controlling the indoor structural drawings.
  • Systems according to some related art have received plan views or perspective views of structural drawings from architects or architectural firms and created the images that may be applied to a variety of fields.
  • systems according to the related art are disadvantageous because such systems cannot adaptively create indoor structural drawings. If indoor structures are modified, systems according to some related art cannot apply the modified structures to services.
  • systems according to some related art can collectively process buildings constructed with the same structure, such as apartments, the like, or a combination thereof, by using the same plan views or structural drawings.
  • if buildings constructed with the same structure are modified so that the buildings have modified structural drawings, or buildings have been constructed with different structures so that the buildings have different structural drawings, systems according to some related art cannot provide services using structural drawings.
  • an aspect of the present disclosure is to provide a method and apparatus for creating structural drawings.
  • Another aspect of the present disclosure is to provide a method and apparatus for adaptively creating structural drawings based on image information.
  • Another aspect of the present disclosure is to provide a method and apparatus for individually creating a structural drawing and setting or modifying a position for a device at a specific point.
  • a method for creating a structural drawing in an electronic device includes setting a reference height for at least one wall, receiving image information regarding at least one wall, generating vectors for the wall of the image information, and generating a structural drawing based on one or more of the generated vectors.
  • an electronic device for creating structural drawings includes an image input unit configured to generate image information for at least one wall, and a controller.
  • the controller is further configured to set a reference height for at least one wall.
  • the controller is further configured to receive the image information regarding at least one wall.
  • the controller is further configured to generate vectors for the wall of the image information and to generate a structural drawing based on one or more of the generated vectors.
  • FIG. 1 illustrates a schematic block diagram of an electronic device according to an embodiment of the present disclosure
  • FIG. 2 illustrates a flow chart that describes a method for creating a structural drawing according to an embodiment of the present disclosure
  • FIG. 3 illustrates a flow chart that describes a process for creating a structural drawing according to an embodiment of the present disclosure
  • FIG. 4 illustrates a diagram that describes a method for measuring a reference height according to an embodiment of the present disclosure
  • FIG. 5 illustrates diagrams that describe processes for extracting edge information according to an embodiment of the present disclosure
  • FIG. 6 illustrates a diagram that describes a method for extracting vectors according to an embodiment of the present disclosure
  • FIG. 7 illustrates diagrams that describe a structural drawing created according to an embodiment of the present disclosure.
  • the electronic device includes an image input system and processes image data.
  • the image input system may be a camera.
  • Examples of the electronic device are a smartphone, a portable terminal, a mobile terminal, a Personal Digital Assistant (PDA), a Portable Multimedia Player (PMP), a note pad, a Wireless Broadband (Wi-Bro) terminal, a tablet Personal Computer (PC), a mobile medical device, an electronic bracelet, an electronic necklace, an appcessory, a camera, a wearable device, an electronic clock, a wrist watch, a home appliance (e.g., refrigerator, air-conditioner, vacuum cleaner, oven, microwave oven, washing machine, air-cleaner, and/or the like), an artificial intelligent robot, an electronic dictionary, a camcorder, the like, or a combination thereof.
  • the electronic device may also be applied to various types of devices with an image input function.
  • FIG. 1 illustrates a schematic block diagram of an electronic device according to an embodiment of the present disclosure.
  • the electronic device 100 may include an image input unit 110 for receiving image information, a sensing unit 120 for sensing data used to create a structural drawing, a controller 130 for processing data to create a structural drawing by using the received image data and the detected data, and a display unit 140 for displaying the detected data and the structural drawing.
  • the image input unit 110 may include a camera module that can receive continuous or non-continuous scenes.
  • the camera module is a device for capturing moving images and still images.
  • the camera module may include one or more image sensors (e.g., a front lens or a rear lens), an image signal processor, a flash Light Emitting Diode (LED), the like, or a combination thereof.
  • image information acquired by the image input unit 110 may be information regarding a group of still images or moving images that can be processed in a panorama format.
  • the sensing unit 120 may include at least one of the following: a gesture sensor, a gyro sensor, an atmospheric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a Red Green Blue (RGB) sensor, a bio sensor, a temperature/humidity sensor, a luminance sensor, and an Ultraviolet (UV) sensor.
  • the sensing unit 120 measures physical quantity, senses operation states of the electronic device, and converts the measured or sensed signals into electrical signals.
  • the sensing unit 120 may be an acceleration sensor.
  • the sensing unit 120 may further include a geomagnetic sensor for sensing measurement information to create a structural drawing.
  • the measurement information may include information related to acceleration, angle, relative direction, absolute direction, the like, or a combination thereof.
  • the controller 130 may control the operation of the electronic device 100 (e.g., the entire operation of the electronic device 100 ).
  • the controller 130 may process the image information from the image input unit 110 and the measurement information from the sensing unit 120 .
  • the controller 130 may correct the received image information, extract effective data from the received image information (e.g., edges), or create a structural drawing based on the received data or extracted data.
  • the controller 130 may extract vectors from the image information and create a structural drawing based on the extracted vectors.
  • the controller 130 may show an image being captured and a structural drawing being generated.
  • the controller 130 may also register a user's requesting information in a structural drawing. For example, the controller 130 may register a position of an appliance that the user specified at a specific point of a structural drawing.
  • the controller 130 may set a reference height for at least one wall, receive image information regarding at least one wall, extract vectors regarding the wall corresponding to the image information, and create a structural drawing based on the extracted vectors.
  • the controller 130 may extract a reference direction, a photographing direction, and lengths of segments of at least one wall, and create vectors based on the extractions.
  • the controller 130 may determine whether the currently received image information is image information regarding a reference wall for creating a structural drawing. When the controller 130 ascertains that the currently received image information is image information regarding the reference wall, the controller 130 may create a structural drawing by combining the vectors that have been extracted with each other. In contrast, when the controller 130 ascertains that the currently received image information is not image information regarding the reference wall, the controller 130 may store the created vectors and extract vectors regarding a new wall.
  • the controller 130 may set a photographing height of a camera of the electronic device 100 and a reference height from the photographing height.
  • the controller 130 may calculate the photographing height based on an acceleration acquired by an acceleration sensor.
  • the controller 130 may set the photographing height by using at least one of the following: a method using an angle between the top and bottom of a wall from the photographing height, a method using pixels from the photographing height to the top of a wall and to the bottom of a wall, and a method using a ratio of a distance from the photographing height to the top of a wall to a distance from the photographing height to the bottom of a wall.
  • the controller 130 may create a structural drawing by combining vectors regarding at least two walls adjoined to each other.
  • the controller 130 may add position information regarding at least one object to the created structural drawing according to one or more user inputs.
  • the display unit 140 may display data required for the creation of a structural drawing and also the structural drawing.
  • the display unit 140 may display information that a user wants to register at a specific point in the created structural drawing. If the display unit 140 is implemented with a touch panel, the display unit 140 may also serve as an input system for detecting one or more user touches.
  • the electronic device 100 may set a reference height starting from the height of the camera, recognize a wall by processing image information from the camera, and make a track.
  • the electronic device 100 may create a structural drawing by moving along a wall using a User Interface (UI).
  • the electronic device 100 may automatically add primary devices and places in the structural drawing by processing the images or particular point or a particular device's position that the user selects using a UI related to the structural drawing.
  • although the electronic device 100 has been described above with reference to its components, it should be understood that the electronic device 100 is not limited to those components and may further include others.
  • FIG. 2 illustrates a flow chart that describes a method for creating a structural drawing according to an embodiment of the present disclosure.
  • the electronic device may start creating a structural drawing according to a user instruction.
  • the electronic device may receive image information using the image input unit at operation S 210 .
  • the image input unit may be a camera.
  • the image information may include information regarding at least one surface. If the camera of the electronic device captures images of a subject (e.g., the inside of a building), the image information may be information regarding at least one wall.
  • the following various embodiments of the present disclosure will be described based on the acquisition of image information from walls inside a building and the creation of a structural drawing based on the acquired image information. It should, however, be understood that these are only illustrative cases for creating an indoor structural drawing and should not limit the scope of the present disclosure.
  • the electronic device may extract vectors from the received image information at operation S 220 .
  • the method for acquiring vectors may include processes of detecting edge components from received image information, and extracting all vector components of edges, with respect to a photographing direction and a reference length, from the detected edge components. The method for extracting vectors will be described in detail later.
  • the electronic device may create a structural drawing by using the extracted vectors at operation S 230 .
  • the electronic device can create a structural drawing of a perspective view or a plan view by using vectors regarding at least one wall extracted at operation S 220 .
  • the structural drawing may be created by combining vectors regarding at least two walls with each other.
  • the structural drawing of a perspective view or a plan view may be created in such a way that vectors regarding Wall 1 and vectors regarding Wall 2 adjoined to Wall 1 are extracted, and the vectors are combined with each other.
  • the structural drawings of the interior of a building may be created by combining vectors regarding all the walls with each other in succession.
  • FIG. 3 illustrates a flow chart that describes a process for creating a structural drawing according to an embodiment of the present disclosure.
  • the electronic device may set a reference length for at least one wall at operation S 310 .
  • the reference length may be a reference height for a wall.
  • the reference height may be measured by image information.
  • the set reference height may serve as a reference value when extracting vectors.
  • the reference height may be measured in various methods (e.g., by using an angle at which image information is received, by using a ratio of pixels, the like, or a combination thereof).
  • the reference height may be set to any value.
  • a method for setting a photographing height of a camera for receiving image information is described as follows. Although the photographing height does not need to be retained during the creation of a structural drawing, the photographing height may be used to measure a reference height.
  • a photographing height of a camera may be set by an acceleration sensor.
  • a method for setting a photographing height by using an acceleration sensor is performed as follows. The electronic device moves from the bottom to a height so that the camera can capture images, and the displacement may be set to a photographing height. For example, while the electronic device is moving to the height to capture images by the camera, an instant acceleration detected by the acceleration sensor and a period of time for the displacement can be acquired and applied to the following Equation 1, thereby calculating the photographing height.
  • h b = (1/2) a t 2 (Equation 1)
  • in Equation 1, ‘a’ denotes acceleration and ‘t’ denotes time.
  • for example, if the acceleration is 1 m/s 2 and the displacement takes 1 s, the photographing height may be measured as 0.5 m.
  • when the acceleration is not constant, the photographing height of a camera may be measured by integrating instant accelerations at the respective time points.
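The double integration of sampled accelerations can be sketched numerically. This is an illustrative sketch only; the function name and the trapezoidal sampling scheme are assumptions, since the patent specifies only that instant accelerations are integrated at the respective time points.

```python
def photographing_height(samples, dt):
    """Estimate camera displacement by double-integrating vertical
    acceleration samples (trapezoidal rule); samples are in m/s^2,
    dt is the sampling interval in seconds."""
    v = 0.0  # running velocity (m/s)
    h = 0.0  # running displacement (m)
    for a_prev, a_next in zip(samples, samples[1:]):
        v_next = v + 0.5 * (a_prev + a_next) * dt  # acceleration -> velocity
        h += 0.5 * (v + v_next) * dt               # velocity -> displacement
        v = v_next
    return h
```

For a constant 1 m/s 2 held for 1 s, this reproduces the 0.5 m value given by Equation 1.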
  • the method for setting a reference height is described as follows. A method for measuring a reference height by using an angle at which image information is received is described referring to FIG. 4 .
  • FIG. 4 illustrates a diagram that describes a method for measuring a reference height according to an embodiment of the present disclosure.
  • the electronic device 420 can capture an image of the wall 410 at the photographing height h b by the camera.
  • the photographing height h b may be measured by the method described above. The heights of the wall may then be related to the measured angles by the following Equation 2: tan θ 1 = h 1 /d w and tan θ 2 = h 2 /d w , where d w is the shortest distance from the electronic device to the wall 410 .
  • in Equation 2, h 1 denotes the length of the wall 410 from the bottom of the wall 410 to the position on the wall 410 corresponding to the photographing height h b , and h 2 denotes the length of the wall 410 from the position corresponding to the photographing height h b to the top of the wall 410 .
  • h 1 may be equal to h b .
  • Angles θ 1 and θ 2 may be measured by image information acquired by the electronic device.
  • the angles may be obtained by extracting edges from the received images.
  • θ 1 may be measured by a slope sensor in such a way that the bottom line of the wall 410 in the received image is set to the center of the UI in the electronic device and the slope sensor acquires the slope.
  • θ 2 may be measured by a slope sensor in such a way that the top line of the wall 410 in the received image is set to the center of the UI in the electronic device and the slope sensor acquires the slope.
  • although the embodiment of the present disclosure measures the angles by using a slope sensor, it should be understood that the present disclosure is not limited thereto.
  • the shortest distance d w from the electronic device to the wall 410 may be measured by the following Equation 3: d w = h b /tan θ 1 .
  • the height of the wall 410 may be measured by the following Equation 4: h ref = h 1 + h 2 = h b + d w ·tan θ 2 .
  • the reference height h ref may be measured by Equations 2 to 4.
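The way Equations 2 to 4 fit together can be sketched numerically. The geometry is assumed from FIG. 4 (θ 1 measured from camera level down to the wall's bottom edge, θ 2 up to its top edge); the function name and radian units are illustrative assumptions.

```python
import math

def reference_height(h_b, theta1, theta2):
    """Estimate the wall (reference) height h_ref from the photographing
    height h_b and the angles theta1/theta2 (radians) from camera level
    to the wall's bottom and top edges, respectively."""
    d_w = h_b / math.tan(theta1)  # shortest distance to the wall (Equation 3)
    h_2 = d_w * math.tan(theta2)  # wall height above camera level (Equation 2)
    return h_b + h_2              # h_ref = h_1 + h_2 with h_1 = h_b (Equation 4)
```

For example, with h b = 1.5 m and both angles at 45 degrees, the wall stands 1.5 m from the device and the estimated height is 3.0 m.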
  • the reference height h ref may be measured by a ratio of the number of pixels in the upper portion to the number of pixels in the lower portion.
  • the photographing direction of the camera can be set to be perpendicular to the wall, referring to the slope of the UI of the electronic device. Thereafter, the top edge and bottom edge of the wall may be extracted from the image taking the wall in the perpendicular direction. The lengths from the position of the wall corresponding to the photographing height to the top edge and the bottom edge may be measured, respectively.
  • the number of pixels from the position of the wall corresponding to the photographing height to the top edge (e.g., the number of top portion of pixels) and the bottom edge (e.g., the number of bottom portion of pixels) can be calculated by using lengths of the extracted top and bottom edges, respectively. Because the length from the position of the wall corresponding to the photographing height to the bottom edge of the wall is equal to the photographing height, the number of pixels can be set. In that case, the proportionality can be established by comparing the number of pixels from the position of the wall corresponding to the photographing height to the top of the wall with that from the position of the wall corresponding to the photographing height to the bottom of the wall, thereby measuring the reference height h ref .
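One way to realize the proportionality described above can be sketched as follows. The function and argument names are assumptions, and a photographing direction perpendicular to the wall is presumed, as stated above.

```python
def reference_height_from_pixels(h_b, px_above, px_below):
    """Estimate the wall height from pixel counts in an image taken
    perpendicular to the wall: px_below pixels span the known
    photographing height h_b, so the scale is h_b / px_below per pixel;
    px_above pixels span the wall above camera level."""
    metres_per_pixel = h_b / px_below
    return h_b + px_above * metres_per_pixel
```

If the top and bottom spans cover the same number of pixels, the camera sits at half the wall height, so the estimate is simply twice h b.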
  • if a reference height is not set, the following processes may still be performed. In that case, a correct length cannot be acquired as a fixed number because the reference height is not set. However, the same ratio for all the extracted edges can be acquired, and thus used to create a structural drawing.
  • the electronic device receives image information at operation S 320 .
  • the image information may be information regarding an image of at least one wall. Operations following S 320 may repeat according to the determination of operation S 350 . If the received image information is an image of a first wall to create a structural drawing, the wall may be set to a reference direction Vref. Vectors may be created by using information regarding photographing directions Vn of images additionally collected with respect to the reference direction Vref, and may be used to create a structural drawing.
  • although the embodiment of the present disclosure measures the reference height href at operation S 310 , it should be understood that the present disclosure is not limited thereto. For example, the reference height href may be measured at operation S 320 .
  • the photographing direction is used to extract vectors.
  • the received image information is used to detect edges and to extract vectors in the following operations.
  • the electronic device detects edges from the received image information at operation S 330 .
  • Edges may correspond to the lines of intersections of walls, between side walls, between a side wall and a floor, and between a side wall and a ceiling.
  • Edges can be detected/extracted from image information by various methods (e.g., Sobel, Prewitt, Roberts, Compass, second-order differential and Laplacian, Canny, linear edge detection, the like, or a combination thereof). These edge detection methods will not be described in detail here.
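As one example of the detectors listed above, a minimal Sobel gradient-magnitude operator can be sketched with NumPy. This is illustrative only; production code would typically call an optimized library routine.

```python
import numpy as np

def sobel_edges(img):
    """Gradient-magnitude edge map of a 2-D grayscale array using the
    3x3 Sobel kernels; border pixels are left at zero."""
    kx = np.array([[-1.0, 0.0, 1.0],
                   [-2.0, 0.0, 2.0],
                   [-1.0, 0.0, 1.0]])
    ky = kx.T
    h, w = img.shape
    out = np.zeros((h, w))
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            patch = img[y - 1:y + 2, x - 1:x + 2]
            gx = float(np.sum(kx * patch))  # horizontal intensity gradient
            gy = float(np.sum(ky * patch))  # vertical intensity gradient
            out[y, x] = np.hypot(gx, gy)
    return out
```

A vertical intensity step, such as the boundary between a bright wall and a dark floor rotated into view, produces a strong response along the step and zero response in flat regions.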
  • the electronic device may extract vectors from the image information, based on the extracted edge information at operation S 340 .
  • FIG. 5 illustrates diagrams that describe processes for extracting edge information according to an embodiment of the present disclosure.
  • FIG. 6 illustrates a diagram that describes a method for extracting vectors according to an embodiment of the present disclosure.
  • the controller of the electronic device may extract information regarding edges of walls from received image information.
  • the wall includes line segments AC (d 1 ), CD (href), DF (d 2 ), and AF (Href).
  • the controller may extract edges of line segments AC, CD, DF, and AF, and recognize the wall.
  • the controller may also extract information regarding virtual points B and E based on information regarding the extracted edges.
  • the virtual point B is a point created by the crossing of a virtual line extended out from the line segment CD and a virtual line drawn from the point A perpendicular to the virtual line extending out from the segment CD.
  • the virtual point E is a point created by the crossing of a virtual line extended out from the line segment CD and a virtual line drawn from the point F perpendicular to the virtual line extending out from the segment CD.
  • the real distance H ref is measured at operation S 310 .
  • the real distance H ref is required to calculate a ratio of real distance d 1 to real distance d 2 or a ratio of a relative length to the reference height, in order to create vectors for walls.
  • the extracted information regarding one or more of each point is applied to the Pythagorean Theorem to extract lengths of the corresponding segments.
  • the controller may extract relative coordinates of points A, B, C, D, E and F.
  • the lengths of the line segments, Href, href, d 1 , d 2 , hp 1 , hp 2 , and hpd may be extracted.
  • ratios of the lengths of line segments may be obtained.
  • when the ratios of the lengths of the line segments are compared with Href and href, the real distances of the wall may be extracted, or the ratios of d 1 and d 2 to the reference height may be extracted.
  • the number of pixels of each line segment may be extracted based on pixel information regarding received image information.
  • a ratio of relative lengths of line segments or the real distance may be extracted based on the number of pixels. For example, a ratio of lengths to hpd, hp 1 , hp 2 , and href may be obtained by using the number of pixels.
  • the controller may obtain θ 1 , θ 2 , d 1 , and d 2 by the following equations.
  • θ 1 = tan −1 ( h p1 / h pd ), θ 2 = tan −1 ( h p2 / h pd ) (Equation 5)
  • d 1 = h pd /cos θ 1 , d 2 = h pd /cos θ 2 (Equation 6)
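Equations 5 and 6 can be sketched directly. The function name is an assumption; h p1, h p2, and h pd are the pixel lengths defined above (the vertical drops of the slanted edges and the horizontal run, respectively).

```python
import math

def edge_angles_and_lengths(h_p1, h_p2, h_pd):
    """Apply Equations 5 and 6: angles of the wall's slanted top/bottom
    edges from the pixel heights h_p1/h_p2 and the horizontal pixel run
    h_pd, then the slanted segment lengths d_1 and d_2."""
    theta1 = math.atan(h_p1 / h_pd)  # Equation 5
    theta2 = math.atan(h_p2 / h_pd)
    d_1 = h_pd / math.cos(theta1)    # Equation 6
    d_2 = h_pd / math.cos(theta2)
    return theta1, theta2, d_1, d_2
```

Note that d = h pd / cos θ is just the hypotenuse of the right triangle with legs h pd and h p, so the result agrees with the Pythagorean extraction mentioned above.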
  • the real lengths of edges of walls or ratios of lengths of walls may be obtained from received image information.
  • the controller may extract information regarding edges of each wall based on a reference direction Vref and photographing directions Vn. For example, the controller may create vectors by combining photographing directions Vn with respect to the reference direction Vref with the length information regarding extracted edge information with respect to the reference direction Vref.
  • the controller may store the vectors regarding edges of each wall created by the method above in the storage.
  • the embodiment of the present disclosure may create a structural drawing by combining the vectors regarding each wall created as described above.
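The combination step can be sketched as chaining (direction, length) wall vectors into plan coordinates. The function name and the degrees-against-the-reference-direction convention are illustrative assumptions.

```python
import math

def plan_vertices(walls):
    """Chain (direction_deg, length) wall vectors, where direction is
    measured against the reference direction Vref of the first wall,
    into 2-D plan-view corner coordinates starting at the origin."""
    x, y = 0.0, 0.0
    pts = [(x, y)]
    for deg, length in walls:
        rad = math.radians(deg)
        x += length * math.cos(rad)
        y += length * math.sin(rad)
        pts.append((round(x, 9), round(y, 9)))
    return pts
```

For a closed room the chain returns to the origin, which mirrors the check at operation S 350 that image information has come back to the beginning position.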
  • the controller of the electronic device determines whether image information currently received by the image input unit is at a beginning position for receiving image information to create a structural drawing at operation S 350 . For example, when the electronic device re-receives image information regarding a first wall to create a structural drawing, the controller concludes that the electronic device has reached the beginning position.
  • the controller may create a structural drawing based on the vectors regarding at least one wall that is created at operation S 360 .
  • the structural drawing may be a two or three dimensional structural drawing.
  • FIG. 7 illustrates diagrams that describe a structural drawing created based on an embodiment of the present disclosure.
  • the vectors regarding wall 1 may be information regarding a combination of the reference direction vref and the length information regarding wall 1 .
  • the vectors regarding wall 2 may be information regarding a combination of an angle or direction of wall 2 , with respect to the reference direction vref, and the length information regarding wall 2 .
  • part of a structural drawing regarding walls 1 and 2 may be created.
  • part of a structural drawing regarding walls 1 , 2 and 3 may be created.
  • the structural drawings regarding all the walls may be created.
  • if information regarding heights of walls is added to the two-dimensional structural drawing, part of the three-dimensional information may be displayed.
  • a three dimensional structural drawing may be created.
  • the controller may store extracted vectors regarding at least one wall in the storage at operation S 370 and returns to operation S 320 .
  • the controller may perform operations S 320 to S 350 based on newly received image information.
  • the controller may create a structural drawing based on the vectors regarding a created wall at operation S 360 .
  • the electronic device can recognize main devices and places within the range of a structural drawing by processing the images, and automatically add them to the drawing.
  • the electronic device can effectively add a main point and a position of a device in a structural drawing created according to a user's instructions.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Computer Hardware Design (AREA)
  • Mathematical Optimization (AREA)
  • General Engineering & Computer Science (AREA)
  • Structural Engineering (AREA)
  • Mathematical Analysis (AREA)
  • Civil Engineering (AREA)
  • Pure & Applied Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Computational Mathematics (AREA)
  • Architecture (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method and an apparatus for creating structural drawings and an electronic device adapted to the method are provided. The method includes setting a reference height for at least one wall, receiving image information regarding at least one wall, generating vectors for the wall of the image information, and generating a structural drawing based on one or more of the generated vectors.

Description

    CROSS-REFERENCE TO RELATED APPLICATION(S)
  • This application claims the benefit under 35 U.S.C. §119(a) of a Korean patent application filed on Jan. 10, 2014 in the Korean Intellectual Property Office and assigned Serial number 10-2014-0003269, the entire disclosure of which is hereby incorporated by reference.
  • TECHNICAL FIELD
  • The present disclosure relates to a method and apparatus for creating structural drawings. More particularly, the present disclosure relates to a method and apparatus for creating structural drawings based on an image processing technology.
  • BACKGROUND
  • Electronic devices have recently been developed to be equipped with more enhanced software or hardware to provide more convenient functions to users according to user needs. Recent mobile devices supporting a camera function have provided various services by using information regarding images acquired by the camera. Image processing technologies for recognizing and editing images have also been developed, along with application technologies that use the processed images.
  • Recent electronic devices have been equipped with a technology for registering and controlling indoor structural drawings. Systems according to some related art have received plan views or perspective views of structural drawings from architects or architectural firms and created images that may be applied to a variety of fields. However, such systems according to the related art are disadvantageous because such systems cannot adaptively create indoor structural drawings. If indoor structures are modified, systems according to some related art cannot apply the modified structures to services.
  • For example, systems according to some related art can collectively process buildings constructed with the same structure, such as apartments, the like, or a combination thereof, by using the same plan views or structural drawings. However, if buildings constructed with the same structure are modified so that the buildings no longer match the original structural drawings, or if buildings have been constructed with different structures so that each building has its own structural drawing, systems according to some related art cannot provide services using structural drawings.
  • The above information is presented as background information only to assist with an understanding of the present disclosure. No determination has been made, and no assertion is made, as to whether any of the above might be applicable as prior art with regard to the present disclosure.
  • SUMMARY
  • Aspects of the present disclosure are to address at least the above-mentioned problems and/or disadvantages and to provide at least the advantages described below. Accordingly, an aspect of the present disclosure is to provide a method and apparatus for creating structural drawings.
  • Another aspect of the present disclosure is to provide a method and apparatus for adaptively creating structural drawings based on image information.
  • Another aspect of the present disclosure is to provide a method and apparatus for individually creating structural drawings and setting or modifying a position of a device at a specific point.
  • In accordance with an aspect of the present disclosure, a method for creating a structural drawing in an electronic device is provided. The method includes setting a reference height for at least one wall, receiving image information regarding at least one wall, generating vectors for the wall of the image information, and generating a structural drawing based on one or more of the generated vectors.
  • In accordance with another aspect of the present disclosure, an electronic device for creating structural drawings is provided. The electronic device includes an image input unit configured to generate image information for at least one wall, and a controller. The controller is further configured to set a reference height for at least one wall. The controller is further configured to receive the image information regarding at least one wall. The controller is further configured to generate vectors for the wall of the image information and to generate a structural drawing based on one or more of the generated vectors.
  • Other aspects, advantages, and salient features of the disclosure will become apparent to those skilled in the art from the following detailed description, which, taken in conjunction with the annexed drawings, discloses various embodiments of the present disclosure.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The above and other aspects, features, and advantages of certain embodiments of the present disclosure will be more apparent from the following description taken in conjunction with the accompanying drawings, in which:
  • FIG. 1 illustrates a schematic block diagram of an electronic device according to an embodiment of the present disclosure;
  • FIG. 2 illustrates a flow chart that describes a method for creating a structural drawing according to an embodiment of the present disclosure;
  • FIG. 3 illustrates a flow chart that describes a process for creating a structural drawing according to an embodiment of the present disclosure;
  • FIG. 4 illustrates a diagram that describes a method for measuring a reference height according to an embodiment of the present disclosure;
  • FIG. 5 illustrates diagrams that describe processes for extracting edge information according to an embodiment of the present disclosure;
  • FIG. 6 illustrates a diagram that describes a method for extracting vectors according to an embodiment of the present disclosure; and
  • FIG. 7 illustrates diagrams that describe a structural drawing created according to an embodiment of the present disclosure.
  • Throughout the drawings, it should be noted that like reference numbers are used to depict the same or similar elements, features, and structures.
  • DETAILED DESCRIPTION
  • The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of various embodiments of the present disclosure as defined by the claims and their equivalents. It includes various specific details to assist in that understanding but these are to be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the various embodiments described herein can be made without departing from the scope and spirit of the present disclosure. In addition, descriptions of well-known functions and constructions may be omitted for clarity and conciseness.
  • The terms and words used in the following description and claims are not limited to the bibliographical meanings, but, are merely used by the inventor to enable a clear and consistent understanding of the present disclosure. Accordingly, it should be apparent to those skilled in the art that the following description of various embodiments of the present disclosure is provided for illustration purpose only and not for the purpose of limiting the present disclosure as defined by the appended claims and their equivalents.
  • It is to be understood that the singular forms “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise. Thus, for example, reference to “a component surface” includes reference to one or more of such surfaces.
  • The electronic device according to various embodiments of the present disclosure includes an image input system and processes image data. The image input system may be a camera. Examples of the electronic device are a smartphone, a portable terminal, a mobile terminal, a Personal Digital Assistant (PDA), a Portable Multimedia Player (PMP), a note pad, a Wireless Broadband (Wi-Bro) terminal, a tablet Personal Computer (PC), a mobile medical device, an electronic bracelet, an electronic necklace, an appcessory, a camera, a wearable device, an electronic clock, a wrist watch, a home appliance (e.g., refrigerator, air-conditioner, vacuum cleaner, oven, microwave oven, washing machine, air-cleaner, and/or the like), an artificial intelligent robot, an electronic dictionary, a camcorder, the like, or a combination thereof. The electronic device may also be applied to various types of devices with an image input function.
  • FIG. 1 illustrates a schematic block diagram of an electronic device according to an embodiment of the present disclosure.
  • Referring to FIG. 1, the electronic device 100 may include an image input unit 110 for receiving image information, a sensing unit 120 for sensing data used to create a structural drawing, a controller 130 for processing data to create a structural drawing by using the received image data and the detected data, and a display unit 140 for displaying the detected data and the structural drawing.
  • The image input unit 110 may include a camera module that can receive a continuous scene or a non-continuous scene. The camera module is a device for capturing moving images and still images. The camera module may include one or more image sensors (e.g., a front lens or a rear lens), an image signal processor, a flash Light Emitting Diode (LED), the like, or a combination thereof. In an embodiment of the present disclosure, image information acquired by the image input unit 110 may be information regarding a group of still images or moving images that can be processed in the format of a panorama.
  • The sensing unit 120 may include at least one of the following: a gesture sensor, a gyro sensor, an atmospheric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a Red Green Blue (RGB) sensor, a bio sensor, a temperature/humidity sensor, a luminance sensor, and an Ultraviolet (UV) sensor. The sensing unit 120 measures physical quantities, senses operation states of the electronic device, and converts the measured or sensed signals into electrical signals.
  • In an embodiment of the present disclosure, the sensing unit 120 may be an acceleration sensor. The sensing unit 120 may further include a geomagnetic sensor for sensing measurement information to create a structural drawing. The measurement information may include information related to acceleration, angle, relative direction, absolute direction, the like, or a combination thereof.
  • The controller 130 may control the operation of the electronic device 100 (e.g., the entire operation of the electronic device 100). The controller 130 may process the image information from the image input unit 110 and the measurement information from the sensing unit 120. The controller 130 may correct the received image information, extract effective data from the received image information (e.g., edges), or create a structural drawing based on the received data or extracted data. The controller 130 may extract vectors from the image information and create a structural drawing based on the extracted vectors. The controller 130 may show an image being captured and a structural drawing being generated. The controller 130 may also register a user's requesting information in a structural drawing. For example, the controller 130 may register a position of an appliance that the user specified at a specific point of a structural drawing.
  • In an embodiment of the present disclosure, the controller 130 may set a reference height for at least one wall, receive image information regarding at least one wall, extract vectors regarding the wall corresponding to the image information, and create a structural drawing based on the extracted vectors.
  • The controller 130 may extract a reference direction, a photographing direction, and lengths of segments of at least one wall, and create vectors based on the extractions.
  • The controller 130 may determine whether the currently receiving image information is image information regarding a reference wall to create a structural drawing. When the controller 130 ascertains that the currently receiving image information is image information regarding a reference wall to create a structural drawing, the controller 130 may create a structural drawing by combining vectors that have been extracted with each other. In contrast, when the controller 130 ascertains that the currently receiving image information is not image information regarding a reference wall to create a structural drawing, the controller 130 may store the created vectors and extract vectors regarding a new wall.
  • The controller 130 may set a photographing height of a camera of the electronic device 100 and a reference height from the photographing height. The controller 130 may calculate the photographing height based on an acceleration acquired by an acceleration sensor.
  • The controller 130 may set the photographing height by using at least one of the following: a method for using an angle between the top and bottom of a wall from the photographing height, a method for using pixels from the photographing height to the top of a wall and to the bottom of a wall, and a method for using a ratio of a distance from the photographing height to the top of a wall to a distance from the photographing height to the bottom of a wall.
  • The controller 130 may create a structural drawing by combining vectors regarding at least two walls adjoined to each other. The controller 130 may add position information regarding at least one object to the created structural drawing according to one or more user inputs.
  • The display unit 140 may display data required for the creation of a structural drawing and also the structural drawing. The display unit 140 may display information that a user wants to register at a specific point in the created structural drawing. If the display unit 140 is implemented with a touch panel, the display unit 140 may also serve as an input system for detecting one or more user touches.
  • In an embodiment of the present disclosure, the electronic device 100 may set a reference height starting from the height of the camera, recognize a wall by processing image information from the camera, and make a track. The electronic device 100 may create a structural drawing by moving along a wall using a User Interface (UI). During the process, the electronic device 100 may automatically add primary devices and places to the structural drawing by processing the images, or may add a particular point or a particular device's position that the user selects using a UI related to the structural drawing.
  • Although the electronic device 100 has been described above, explaining its components, it should be understood that the electronic device 100 is not limited to the components and may further include others.
  • FIG. 2 illustrates a flow chart that describes a method for creating a structural drawing according to an embodiment of the present disclosure.
  • Referring to FIG. 2, the electronic device may start creating a structural drawing according to a user instruction. The electronic device may receive image information using the image input unit at operation S210. The image input unit may be a camera. The image information may include information regarding at least one surface. If the camera of the electronic device captures images of a subject (e.g., the inside of a building), the image information may be information regarding at least one wall. The following various embodiments of the present disclosure will be described based on the acquisition of image information from walls inside a building and the creation of a structural drawing based on the acquired image information. It should be understood, however, that these are only illustrated as examples of creating an indoor structural drawing and do not limit the scope of the present disclosure.
  • The electronic device may extract vectors from the received image information at operation S220. The method for acquiring vectors may include processes of detecting edge components from received image information, and extracting all vector components of edges, with respect to a photographing direction and a reference length, from the detected edge components. The method for extracting vectors will be described in detail later.
  • The electronic device may create a structural drawing by using the extracted vectors at operation S230. The electronic device can create a structural drawing of a perspective view or a plan view by using vectors regarding at least one wall extracted at operation S220. The structural drawing may be created by combining vectors regarding at least two walls with each other. For example, the structural drawing of a perspective view or a plan view may be created in such a way that: vectors regarding Wall 1 and vectors regarding Wall 2 adjoined to Wall 1 are extracted; and the vectors are combined with each other. In addition, the structural drawings of the interior of a building may be created by combining vectors regarding all the walls with each other in succession.
  • FIG. 3 illustrates a flow chart that describes a process for creating a structural drawing according to an embodiment of the present disclosure.
  • Referring to FIG. 3, the electronic device may set a reference length for at least one wall at operation S310. The reference length may be a reference height for a wall. The reference height may be measured by image information. The set reference height may serve as a reference value when extracting vectors.
  • The reference height may be measured by various methods (e.g., by using an angle at which image information is received, by using a ratio of pixels, the like, or a combination thereof). The reference height may be set to any value.
  • A method for setting a photographing height of a camera for receiving image information is described as follows. Although the photographing height does not need to be retained during the creation of a structural drawing, the photographing height may be used to measure a reference height. For example, a photographing height of a camera may be set by an acceleration sensor. A method for setting a photographing height by using an acceleration sensor is performed as follows. The electronic device moves from the bottom to a height so that the camera can capture images, and the displacement may be set to a photographing height. For example, while the electronic device is moving to the height to capture images by the camera, an instant acceleration detected by the acceleration sensor and a period of time for the displacement can be acquired and applied to the following Equation 1, thereby calculating the photographing height.

  • $h_b = \int_0^t \left( \int_0^t a \, dt \right) dt$   Equation 1
  • In Equation 1, ‘a’ denotes acceleration, and ‘t’ denotes time. For example, when the electronic device is accelerated to a velocity of 1 m/s for 1 second, the acceleration is 1 m/s². In that case, the photographing height may be measured as 0.5 m. Because the motion of the electronic device may not be measured as a constant acceleration in real measurement, the photographing height of a camera may be measured by integrating instant accelerations at respective time points. Although the embodiment of the present disclosure measures a photographing height of the electronic device by an acceleration sensor, it should be understood that the present disclosure is not limited to the embodiment of the present disclosure.
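As an illustration only (not part of the original disclosure), the double integration of Equation 1 can be approximated numerically from sampled accelerometer readings. The function name and sampling interval below are hypothetical:

```python
def photographing_height(accel_samples, dt):
    """Approximate h_b from Equation 1 by double rectangular (Euler)
    integration of acceleration samples taken every dt seconds."""
    velocity = 0.0
    height = 0.0
    for a in accel_samples:
        velocity += a * dt       # inner integral: v(t) = integral of a dt
        height += velocity * dt  # outer integral: h(t) = integral of v dt
    return height

# The example from the text: a constant 1 m/s^2 for 1 second
# should give a height close to 0.5 m.
samples = [1.0] * 1000           # 1000 samples at 1 ms intervals
h = photographing_height(samples, dt=0.001)
```

In practice, the instantaneous accelerations reported by the sensor would replace the constant samples.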
  • The method for setting a reference height is described as follows. A method for measuring a reference height by using an angle at which image information is received is described referring to FIG. 4.
  • FIG. 4 illustrates a diagram that describes a method for measuring a reference height according to an embodiment of the present disclosure.
  • Referring to FIG. 4, a method for obtaining a reference height href of a wall 410 is described as follows. The electronic device 420 can capture an image of the wall 410 at the photographing height hb by the camera. The photographing height hb may be measured by the method described above, using the following Equation 2.

  • $h_{ref} = h_1 + h_2$   Equation 2
  • In Equation 2, h1 denotes the length of the wall 410 from the bottom of the wall 410 to the position on the wall 410 corresponding to the photographing height hb, and h2 denotes the length of the wall 410 from the position on the wall 410 corresponding to the photographing height hb to the top of the wall 410. h1 may be equal to hb.
  • Angles θ1 and θ2 may be measured by image information acquired by the electronic device. For example, the angles may be obtained by extracting edges from the received images. θ1 may be measured by a slope sensor in such a way that the bottom line of the wall 410 in the received image is set to the center of the UI in the electronic device and the slope sensor acquires the slope. Similarly, θ2 may be measured by a slope sensor in such a way that the top line of the wall 410 in the received image is set to the center of the UI in the electronic device and the slope sensor acquires the slope. Although the embodiment of the present disclosure measures the angles by using a slope sensor, it should be understood that the present disclosure is not limited to the embodiment of the present disclosure.
  • The shortest distance dw from the electronic device to the wall 410 may be measured by the following Equation 3.
  • $d_w = \dfrac{h_1}{\tan\theta_1}$   Equation 3
  • h2, the length of the wall 410, may be measured by the following Equation 4.

  • $h_2 = d_w \tan\theta_2$   Equation 4
  • As described above, the reference height href may be measured by Equations 2 to 4.
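A minimal sketch of Equations 2 to 4 in Python (illustrative only; the function name and the sample angle values are assumptions, not from the disclosure):

```python
import math

def reference_height(h_b, theta1_deg, theta2_deg):
    """Compute h_ref = h1 + h2 per Equations 2-4, where h1 equals the
    photographing height h_b, theta1 is the angle down to the bottom of
    the wall, and theta2 is the angle up to the top of the wall."""
    theta1 = math.radians(theta1_deg)
    theta2 = math.radians(theta2_deg)
    h1 = h_b                       # the camera height spans the lower part
    d_w = h1 / math.tan(theta1)    # Equation 3: shortest distance to the wall
    h2 = d_w * math.tan(theta2)    # Equation 4
    return h1 + h2                 # Equation 2

# e.g., camera at 1.5 m, 45 degrees down to the floor line, 30 degrees up
h_ref = reference_height(h_b=1.5, theta1_deg=45.0, theta2_deg=30.0)
```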
  • In an embodiment of the present disclosure, the reference height href may be measured by a ratio of the upper portion of pixels to the lower portion of pixels. The photographing direction of the camera can be set to be perpendicular to the wall, referring to the slope of the UI of the electronic device. Thereafter, the top edge and bottom edge of the wall may be extracted from the image of the wall taken in the perpendicular direction. The lengths from the position of the wall corresponding to the photographing height to the top edge and the bottom edge may be measured, respectively. The number of pixels from the position of the wall corresponding to the photographing height to the top edge (e.g., the number of top-portion pixels) and to the bottom edge (e.g., the number of bottom-portion pixels) can be calculated by using the lengths of the extracted top and bottom edges, respectively. Because the length from the position of the wall corresponding to the photographing height to the bottom edge of the wall is equal to the photographing height, the real length corresponding to that number of pixels is known. In that case, a proportionality can be established by comparing the number of pixels from the position of the wall corresponding to the photographing height to the top of the wall with that from the position of the wall corresponding to the photographing height to the bottom of the wall, thereby measuring the reference height href.
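The pixel-ratio method above can be sketched as follows (illustrative only; the pixel counts are made-up values): because the span below the photographing height is known to equal the photographing height, the meters-per-pixel scale follows from the bottom pixel count, and the reference height is the sum of both spans.

```python
def reference_height_from_pixels(h_b, pixels_above, pixels_below):
    """Scale the pixel count above the photographing height by the known
    metric length (h_b) of the span below it, then add both spans."""
    meters_per_pixel = h_b / pixels_below
    return h_b + pixels_above * meters_per_pixel

# e.g., 400 pixels above and 500 pixels below the photographing height of 1.5 m
h_ref = reference_height_from_pixels(h_b=1.5, pixels_above=400, pixels_below=500)
```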
  • In an embodiment of the present disclosure, after the reference height href is set to any value, the following processes may be performed. In that case, a correct length as a fixed number cannot be acquired because the reference height is not set. However, the same ratio for all the extracted edges can be acquired, and thus used to create a structural drawing.
  • Referring back to FIG. 3, the electronic device receives image information at operation S320. The image information may be information regarding an image of at least one wall. Operations following S320 may repeat according to the determination of operation S350. If the received image information is an image of a first wall to create a structural drawing, the direction of that wall may be set as the reference direction Vref. Vectors may be created by using information regarding photographing directions Vn of images additionally collected with respect to the reference direction Vref, and may be used to create a structural drawing. Although the embodiment of the present disclosure measures the reference height href at operation S310, it should be understood that the present disclosure is not limited to the embodiment of the present disclosure. For example, the reference height href may be measured at operation S320. The photographing direction is used to extract vectors. The received image information is used to detect edges and to extract vectors in the following operations.
  • The electronic device detects edges from the received image information at operation S330. Edges may correspond to the lines of intersections of walls, between side walls, between a side wall and a floor, and between a side wall and a ceiling. Edges can be detected/extracted from image information by various methods (e.g., Sobel, Prewitt, Roberts, Compass, the second-order differential and Laplacian, Canny, linear edge detection, the like, or a combination thereof). These edge detection methods will not be described here.
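As a sketch of one of the methods named above (Sobel), the following pure-Python edge detector marks pixels whose gradient magnitude exceeds a threshold; this is an illustration only, and a real implementation would use an optimized image-processing library instead:

```python
def sobel_edges(img, threshold):
    """Minimal Sobel edge detector over a 2D list of grayscale values."""
    gx_k = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal gradient kernel
    gy_k = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical gradient kernel
    h, w = len(img), len(img[0])
    edges = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(gx_k[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(gy_k[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            if (gx * gx + gy * gy) ** 0.5 > threshold:
                edges[y][x] = 1
    return edges

# A vertical intensity step produces a vertical edge line.
img = [[0, 0, 255, 255]] * 4
edges = sobel_edges(img, threshold=100)
```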
  • The electronic device may extract vectors from the image information, based on the extracted edge information at operation S340.
  • FIG. 5 illustrates diagrams that describe processes for extracting edge information according to an embodiment of the present disclosure.
  • The method for extracting vectors according to an embodiment of the present disclosure is described referring to FIG. 6.
  • FIG. 6 illustrates a diagram that describes a method for extracting vectors according to an embodiment of the present disclosure.
  • Referring to FIG. 6, the controller of the electronic device may extract information regarding edges of walls from received image information. The wall includes line segments AC (d1), CD (href), DF (d2), and AF (Href). The controller may extract edges of line segments AC, CD, DF, and AF, and recognize the wall. The controller may also extract information regarding virtual points B and E based on information regarding the extracted edges. The virtual point B is a point created by crossing a virtual line extended out from the line segment CD and a virtual line drawn from the point A perpendicular to the virtual line extended out from the segment CD. The virtual point E is a point created by crossing a virtual line extended out from the line segment CD and a virtual line drawn from the point F perpendicular to the virtual line extended out from the segment CD.
  • The real distance Href is measured at operation S310. The real distance Href is required to calculate a ratio of the real distance d1 to the real distance d2, or a ratio of a relative length to the reference height, in order to create vectors for walls.
  • To do this, after extracting information regarding each point, the extracted information is applied to the Pythagorean Theorem to extract the lengths of the corresponding segments. The controller may extract relative coordinates of points A, B, C, D, E and F. When the extracted coordinates are applied to the Pythagorean Theorem, the lengths of the line segments Href, href, d1, d2, hp1, hp2, and hpd may be extracted. In addition, ratios of the lengths of the line segments may be obtained. When the ratios of the lengths of the line segments are compared with Href and href, the real distance of the wall may be extracted, or the ratios of d1 and d2 to the reference height may be extracted.
  • In an embodiment of the present disclosure, the number of pixels of each line segment may be extracted based on pixel information regarding received image information. A ratio of relative lengths of line segments or the real distance may be extracted based on the number of pixels. For example, a ratio of lengths to hpd, hp1, hp2, and href may be obtained by using the number of pixels.
  • The controller may obtain θ1, θ2, d1, and d2 by the following equations.
  • $\theta_1 = \tan^{-1}\dfrac{h_{p1}}{h_{pd}}, \quad \theta_2 = \tan^{-1}\dfrac{h_{p2}}{h_{pd}}$   Equation 5
  • $d_1 = \dfrac{h_{pd}}{\cos\theta_1}, \quad d_2 = \dfrac{h_{pd}}{\cos\theta_2}$   Equation 6
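Equations 5 and 6 can be checked with a short sketch (illustrative values only; a 3-4-5 right triangle makes the expected hypotenuse obvious):

```python
import math

def segment_lengths(h_pd, h_p1, h_p2):
    """Recover the slanted segment lengths d1 and d2 from the pixel-derived
    quantities h_pd, h_p1, and h_p2 per Equations 5 and 6."""
    theta1 = math.atan(h_p1 / h_pd)   # Equation 5
    theta2 = math.atan(h_p2 / h_pd)
    d1 = h_pd / math.cos(theta1)      # Equation 6
    d2 = h_pd / math.cos(theta2)
    return d1, d2

# With h_pd = 3 and h_p1 = h_p2 = 4, each hypotenuse is 5 (3-4-5 triangle).
d1, d2 = segment_lengths(h_pd=3.0, h_p1=4.0, h_p2=4.0)
```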
  • As described above, the real lengths of edges of walls or ratios of lengths of walls may be obtained from received image information.
  • The controller may extract information regarding edges of each wall based on a reference direction Vref and photographing directions Vn. For example, the controller may create vectors by combining the photographing directions Vn, expressed with respect to the reference direction Vref, with the length information obtained from the extracted edge information.
  • The controller may store the vectors regarding edges of each wall created by the method above in the storage. The embodiment of the present disclosure may create a structural drawing by combining the vectors regarding each wall created as described above.
  • Referring back to FIG. 3, the controller of the electronic device determines whether the image information currently received by the image input unit corresponds to the beginning position at which image information was first received to create the structural drawing at operation S350. For example, when the electronic device re-receives image information regarding the first wall used to create the structural drawing, the controller concludes that the electronic device has reached the beginning position.
  • When the controller ascertains that currently received image information is at a beginning position at operation S350, the controller may create, at operation S360, a structural drawing based on the vectors regarding at least one wall that have been created. The structural drawing may be a two- or three-dimensional structural drawing.
  • FIG. 7 illustrates diagrams that describe a structural drawing created according to an embodiment of the present disclosure.
  • Referring to FIG. 7, a two-dimensional structural drawing according to an embodiment of the present disclosure is illustrated. The vectors regarding wall 1 may be information regarding a combination of the reference direction vref and the length information regarding wall 1. The vectors regarding wall 2 may be information regarding a combination of an angle or direction of wall 2, with respect to the reference direction vref, and the length information regarding wall 2. When combining vectors regarding wall 1 with vectors regarding wall 2, part of a structural drawing regarding walls 1 and 2 may be created. Similarly, when combining vectors regarding wall 2 with vectors regarding wall 3, part of a structural drawing regarding walls 1, 2 and 3 may be created. In addition, when combining vectors regarding wall 3 with vectors regarding the rest of the walls, the structural drawing regarding all the walls may be created. When information regarding heights of walls is added to the two-dimensional structural drawing, part of the three-dimensional information may be displayed. When information regarding heights of walls is combined with the two-dimensional information, a three-dimensional structural drawing may be created.
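The combination of per-wall vectors described above can be sketched as follows (illustrative only; wall vectors are modeled here as hypothetical (angle relative to the reference direction in degrees, length in meters) pairs):

```python
import math

def walls_to_floorplan(walls):
    """Chain per-wall vectors end-to-end from the origin to obtain the 2D
    corner coordinates of the structural drawing."""
    x, y = 0.0, 0.0
    corners = [(x, y)]
    for angle_deg, length in walls:
        rad = math.radians(angle_deg)
        x += length * math.cos(rad)
        y += length * math.sin(rad)
        corners.append((round(x, 6), round(y, 6)))
    return corners

# Four walls of a 4 m x 3 m room; the last corner returns to the origin,
# which corresponds to reaching the beginning position in FIG. 3.
corners = walls_to_floorplan([(0, 4.0), (90, 3.0), (180, 4.0), (270, 3.0)])
```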
  • Referring back to FIG. 3, when the controller ascertains that currently received image information does not correspond to the beginning position at operation S350, the controller may store the extracted vectors regarding at least one wall in the storage at operation S370 and return to operation S320. The controller may perform operations S320 to S350 based on newly received image information. When the controller ascertains that currently received image information corresponds to the beginning position at operation S350, the controller may create a structural drawing based on the created vectors regarding at least one wall at operation S360.
  • As described above, the method and apparatus according to the embodiments of the present disclosure can effectively create structural drawings.
  • The electronic device according to the embodiments of the present disclosure can recognize main devices and the places in a range of structural drawing by processing the images, and automatically add them to the drawing.
  • The electronic device can effectively add a main point and the position of a device to a structural drawing created according to a user's instructions.
  • While the present disclosure has been shown and described with reference to various embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present disclosure as defined by the appended claims and their equivalents.

Claims (21)

What is claimed is:
1. A method for creating a structural drawing in an electronic device, the method comprising:
setting a reference height for at least one wall;
receiving image information regarding at least one wall;
generating vectors for the wall of the image information; and
generating a structural drawing based on one or more of the generated vectors.
2. The method of claim 1, wherein the generating of the vectors comprises:
extracting reference direction information and photographing direction information;
extracting information regarding lengths of segments of at least one wall; and
generating the vectors based on the reference direction information, the photographing direction information, and the information regarding lengths of segments.
3. The method of claim 1, further comprising:
determining whether the currently received image information is image information of a reference wall to generate the structural drawing.
4. The method of claim 3, further comprising:
generating, in the event that the currently received image information corresponds to image information of the reference wall to generate the structural drawing, a corresponding structural drawing by combining vectors that have been generated with each other; and
storing, in the event that the currently received image information does not correspond to image information of the reference wall to generate the structural drawing, the created vectors and creating vectors for a new wall.
5. The method of claim 1, wherein the setting of the reference height comprises:
setting a photographing height at which the electronic device captures images; and
setting the reference height from the photographing height.
6. The method of claim 5, further comprising:
calculating the photographing height using an acceleration detected by an acceleration sensor.
7. The method of claim 5, wherein the reference height is set by at least one of the following:
using angles between the photographing height and a top of the at least one wall and between the photographing height and a bottom of the at least one wall;
using information regarding pixels between the photographing height and the top of the at least one wall and information regarding pixels between the photographing height and the bottom of the at least one wall; and
using a ratio of a distance from the photographing height to the top of the at least one wall to a distance from the photographing height to the bottom of the at least one wall.
8. The method of claim 1, wherein the generating of the structural drawing comprises:
combining vectors of at least two walls, adjoined, with each other.
9. The method of claim 1, further comprising:
adding position information regarding at least one object to the generated structural drawing according to one or more user inputs.
10. The method of claim 1, wherein the receiving of the image information regarding the at least one wall comprises:
capturing an image of the at least one wall using a camera operatively connected to the electronic device.
11. An electronic device for creating structural drawings, the electronic device comprising:
an image input unit configured to generate image information for at least one wall; and
a controller configured to set a reference height for at least one wall, to receive the image information regarding the at least one wall, to generate vectors for the wall of the image information, and to generate the structural drawing based on one or more of the generated vectors.
12. The electronic device of claim 11, wherein the controller is further configured to extract reference direction information and photographing direction information, to extract information regarding lengths of segments of at least one wall, and to generate the vectors based on the reference direction information, the photographing direction information, and the information regarding lengths of segments.
13. The electronic device of claim 11, wherein the controller is further configured to determine whether the currently received image information is image information of a reference wall to create the structural drawing.
14. The electronic device of claim 13, wherein the controller is further configured to generate, in the event that the currently received image information corresponds to image information of a reference wall to generate a structural drawing, a corresponding structural drawing by combining vectors that have been created with each other, and to store, in the event that the currently received image information does not correspond to image information of the reference wall to generate the structural drawing, the created vectors and to create vectors for a new wall.
15. The electronic device of claim 11, wherein the controller is further configured to set a photographing height at which the electronic device captures images, and to set the reference height from the photographing height.
16. The electronic device of claim 15, further comprising:
an acceleration sensor for sensing accelerations,
wherein the controller calculates the photographing height using accelerations detected by the acceleration sensor.
17. The electronic device of claim 15, wherein the controller is further configured to set the photographing height by using at least one of the following:
angles between the photographing height and a top of the at least one wall and between the photographing height and a bottom of the at least one wall;
information regarding pixels between the photographing height and the top of the at least one wall and information regarding pixels between the photographing height and the bottom of the at least one wall; and
a ratio of a distance from the photographing height to the top of the at least one wall to a distance from the photographing height to the bottom of the at least one wall.
18. The electronic device of claim 11, wherein the controller is further configured to generate the structural drawing by combining vectors of at least two walls, adjoined, with each other.
19. The electronic device of claim 11, wherein the controller is further configured to add position information regarding at least one object to the created structural drawing according to one or more user inputs.
20. The electronic device of claim 11, further comprising:
a camera unit configured to capture one or more images;
wherein the controller is further configured to receive the image information regarding the at least one wall using the camera unit.
21. A non-transitory computer-readable storage medium storing instructions that, when executed, cause one or more processors to perform the method of claim 1.
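The height estimation recited in claims 5 to 7 can be illustrated with a simple pinhole-camera sketch. The assumptions here are mine, not the patent's: the optical axis is horizontal, alpha and beta are the angles from that axis to the wall's top and bottom edges, and h is the photographing height above the floor, so the horizontal distance to the wall is h / tan(beta) and the wall height follows from the upward angle.

```python
import math

def wall_height_from_angles(h, alpha_deg, beta_deg):
    """Estimate wall height from photographing height and edge angles.

    h         -- photographing height above the floor (metres)
    alpha_deg -- angle above the horizontal axis to the wall's top edge
    beta_deg  -- angle below the horizontal axis to the wall's bottom edge
    """
    d = h / math.tan(math.radians(beta_deg))   # horizontal distance to wall
    return h + d * math.tan(math.radians(alpha_deg))

# Camera 1.5 m above the floor; top edge 45 degrees up, bottom edge 45
# degrees down: the top is as far above the camera as the floor is below it.
H = wall_height_from_angles(1.5, 45.0, 45.0)
```

The pixel-count and distance-ratio variants of claim 7 would substitute pixel offsets or a top/bottom ratio for the two angles under the same geometry.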
US14/586,139 2014-01-10 2014-12-30 Method and apparatus for creating structural drawing Expired - Fee Related US9607413B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020140003269A KR102127978B1 (en) 2014-01-10 2014-01-10 A method and an apparatus for generating structure
KR10-2014-0003269 2014-01-10

Publications (2)

Publication Number Publication Date
US20150199829A1 true US20150199829A1 (en) 2015-07-16
US9607413B2 US9607413B2 (en) 2017-03-28

Family

ID=53521821

Family Applications (1)

Application Number Title Priority Date Filing Date
US14/586,139 Expired - Fee Related US9607413B2 (en) 2014-01-10 2014-12-30 Method and apparatus for creating structural drawing

Country Status (2)

Country Link
US (1) US9607413B2 (en)
KR (1) KR102127978B1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2138179A1 (en) * 1999-03-29 2009-12-30 Shire Canada Inc. Use of cytidine derivatives for the treatment of leukaemia

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040023190A1 (en) * 2002-03-12 2004-02-05 The Secretary Of State For The Department Of Transport, Local Government And The Regions Customised map specification
US20050041100A1 (en) * 1995-05-30 2005-02-24 Maguire Francis J. Apparatus for inducing attitudinal head movements for passive virtual reality
US20110205365A1 (en) * 2008-10-28 2011-08-25 Pasco Corporation Road measurement device and method for measuring road
US20120169868A1 (en) * 2010-12-31 2012-07-05 Kt Corporation Method and apparatus for measuring sizes of objects in image
US20140267422A1 (en) * 2013-03-12 2014-09-18 Honeywell International Inc. Aircraft flight deck displays and systems and methods for enhanced display of obstacles in a combined vision display
US20150187138A1 (en) * 2013-12-31 2015-07-02 Daqri, Llc Visualization of physical characteristics in augmented reality

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20080021300A (en) * 2006-09-04 2008-03-07 연세대학교 산학협력단 Structure diagnostic system by lidar and diagnostic method
JP4905210B2 (en) * 2007-03-26 2012-03-28 富士通株式会社 Three-dimensional internal space model generation method, apparatus, and program
KR20100041926A (en) * 2008-10-15 2010-04-23 에스케이 텔레콤주식회사 System and method for location confirmation service, and method for creating location information thereof
KR101053405B1 (en) * 2009-01-29 2011-08-01 영남대학교 산학협력단 Structure Deformation Detection System and Method
JP5528018B2 (en) 2009-06-23 2014-06-25 キヤノン株式会社 Image processing apparatus, image processing apparatus control method, and program
CN103180883A (en) * 2010-10-07 2013-06-26 桑格威迪公司 Rapid 3d modeling
JP5836013B2 (en) 2011-08-26 2015-12-24 キヤノン株式会社 Image processing apparatus, control method thereof, and program
KR102078198B1 (en) * 2013-06-04 2020-04-07 삼성전자주식회사 Shooting Method for Three-Dimensional Modeling And Electrical Device Thereof


Also Published As

Publication number Publication date
US9607413B2 (en) 2017-03-28
KR20150083573A (en) 2015-07-20
KR102127978B1 (en) 2020-06-29

Similar Documents

Publication Publication Date Title
US10467933B2 (en) Display device and image displaying method therefor
US9129157B2 (en) Method for image-based status determination
US9888215B2 (en) Indoor scene capture system
US9953618B2 (en) Using a plurality of sensors for mapping and localization
CN112005548B (en) Method of generating depth information and electronic device supporting the same
CN109840950B (en) Method for obtaining real-size 3D model and surveying device
US9280209B2 (en) Method for generating 3D coordinates and mobile terminal for generating 3D coordinates
CN104899361B (en) A kind of remote control method and device
WO2020244592A1 (en) Object pick and place detection system, method and apparatus
US9607413B2 (en) Method and apparatus for creating structural drawing
CN104769486B (en) Use the image processing system for polarizing poor video camera
CN106605188A (en) Information processing device, information processing method, and program
US9904355B2 (en) Display method, image capturing method and electronic device
TW201324436A (en) Method and system establishing 3D object
US20140043443A1 (en) Method and system for displaying content to have a fixed pose
US20160314370A1 (en) Method and apparatus for determination of object measurements based on measurement assumption of one or more common objects in an image
JP6559788B2 (en) Information provision device
JP6448413B2 (en) Roof slope estimation system and roof slope estimation method
US20220051018A1 (en) Method and device for detecting a vertical planar surface
CN104931039A (en) Free space positioning method and system
US20160011675A1 (en) Absolute Position 3D Pointing using Light Tracking and Relative Position Detection
JP6716897B2 (en) Operation detection method, operation detection device, and operation detection program
JP6597277B2 (en) Projection apparatus, projection method, and computer program for projection
TWI477969B (en) Remote control method, remote control system and electronic appararus
CN115393427A (en) Method and device for determining position and posture of camera, computer equipment and storage medium

Legal Events

Date Code Title Description
AS Assignment

Owner name: SAMSUNG ELECTRONICS CO., LTD., KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BAEK, DUSAN;KIM, SUNGHOO;REEL/FRAME:034602/0522

Effective date: 20141209

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20210328