
US20160132992A1 - User interface scaling for devices based on display size - Google Patents


Info

Publication number
US20160132992A1
US20160132992A1 (application US14/726,868; also published as US201514726868A)
Authority
US
United States
Prior art keywords
display
processing device
class
display class
user interface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/726,868
Inventor
Maya Rodrig
Darron Stepanich
Patrick Boyd
Alexandre Grigorovitch
Scott Walker
Vlad Riscutia
Julie Seto
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Microsoft Technology Licensing LLC
Original Assignee
Microsoft Technology Licensing LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Microsoft Technology Licensing LLC filed Critical Microsoft Technology Licensing LLC
Priority to US14/726,868
Assigned to MICROSOFT TECHNOLOGY LICENSING, LLC reassignment MICROSOFT TECHNOLOGY LICENSING, LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BOYD, PATRICK, GRIGOROVITCH, ALEXANDRE, RODRIG, MAYA, STEPANICH, DARRON, RISCUTIA, Vlad, WALKER, SCOTT, SETO, Julie
Publication of US20160132992A1


Classifications

    • G06F 3/0484 — Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/017 — Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0483 — Interaction with page-structured environments, e.g. book metaphor
    • G06F 3/04886 — Interaction techniques using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G06F 8/38 — Creation or generation of source code for implementing user interfaces
    • G06F 9/451 — Execution arrangements for user interfaces
    • G06T 3/4092 — Image resolution transcoding, e.g. by using client-server architectures
    • G06F 2203/04803 — Split screen, i.e. subdividing the display area or the window area into separate subareas
    • G06T 2200/16 — Indexing scheme involving adaptation to the client's capabilities
    • G06T 2200/24 — Indexing scheme involving graphical user interfaces [GUIs]

Definitions

  • Processing devices such as personal computers (PCs), slates, and phones offer a wide range of screen sizes.
  • Non-limiting examples of the present disclosure describe user interface scaling based on a detected display size associated with a connected processing device.
  • a display size associated with a connected processing device is detected.
  • a display class is determined based on the detected display size.
  • a user interface for an application is launched on the connected processing device based on the determined display class.
  • a user interface is scaled based on connection of a processing device having a different display size from a first processing device.
  • a user interface for an application is launched at a first scaled model based on determining a display class associated with a first processing device.
  • the display class associated with the first processing device is detected based on a determined display size of the first processing device.
  • Connection of a second processing device is detected.
  • a display class associated with the second processing device is determined upon connection of the second processing device.
  • the display class associated with the second processing device is detected based on a determined display size of the second processing device.
  • the user interface is adapted to display, on the second processing device, at a second scaled model designed for the second processing device upon determining that the display class of the second processing device is different from the display class of the first processing device.
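The flow described above can be sketched as follows. The class names and size thresholds below are illustrative assumptions, since the disclosure does not fix specific cutoffs:

```python
# Illustrative display-class thresholds in inches of screen diagonal;
# the disclosure does not specify exact cutoffs, so these are assumptions.
PHONE_MAX_IN = 5.0
PHABLET_MAX_IN = 7.0
SLATE_MAX_IN = 13.0

def classify_display(diagonal_in: float) -> str:
    """Map a detected display size to a display class."""
    if diagonal_in <= PHONE_MAX_IN:
        return "phone"
    if diagonal_in <= PHABLET_MAX_IN:
        return "phablet"
    if diagonal_in <= SLATE_MAX_IN:
        return "slate"
    return "large-screen"

def launch_ui(diagonal_in: float) -> str:
    """Launch the application UI at the scaled model for the detected class."""
    return f"ui-model:{classify_display(diagonal_in)}"

def on_device_connected(current_class: str, new_diagonal_in: float) -> str:
    """Adapt the UI only when the newly connected display falls into a
    different display class than the current one."""
    new_class = classify_display(new_diagonal_in)
    if new_class != current_class:
        return f"ui-model:{new_class}"      # switch to the new scaled model
    return f"ui-model:{current_class}"      # keep the current scaled model
```

Connecting a phone-class device to a large external display would thus switch the UI to the large-screen scaled model, while connecting two devices in the same class leaves the UI unchanged.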
  • FIG. 1 is a block diagram illustrating an example of a computing device with which aspects of the present disclosure may be practiced.
  • FIGS. 2A and 2B are simplified block diagrams of a mobile computing device with which aspects of the present disclosure may be practiced.
  • FIG. 3 is a simplified block diagram of a distributed computing system in which aspects of the present disclosure may be practiced.
  • FIG. 4 is an exemplary method for launching a user interface with which aspects of the present disclosure may be practiced.
  • FIG. 5 is an exemplary method for adapting a user interface with which aspects of the present disclosure may be practiced.
  • FIG. 6 is a diagram illustrating user interface scaling models with which aspects of the present disclosure may be practiced.
  • FIG. 7 is a diagram illustrating display for exemplary processing devices of different sizes with which aspects of the present disclosure may be practiced.
  • FIG. 8 is a diagram illustrating user interface examples with which aspects of the present disclosure may be practiced.
  • FIG. 9 is a diagram illustrating user interface examples with which aspects of the present disclosure may be practiced.
  • a user may be viewing an application on a device having a smaller display size (e.g., a mobile phone) and proceed to connect that device to a device having a larger display size (e.g., a PC).
  • Attempted resizing of an application across differing display sizes may drastically affect the display and operation of the UI for an application and/or application control.
  • systems are typically unable to recognize that a UI is to be scaled to a different programmed version to account for display size changes.
  • Other instances of building UI packages may incorporate a scaling model for large and small screen devices but are only able to show a single type of UI (e.g., phone version or slate version) once an application is installed. This may limit a user's ability to connect to large display screens and enjoy UI that takes advantage of available display space.
  • Examples of the present disclosure describe a hybrid approach for scaling UI that accommodates changes in display size, whether from a change to the display itself or from connection of a device having a different screen size.
  • applications are developed that can execute/run on a plurality of devices having different display sizes.
  • Examples of a scalable UI of the present disclosure combine multiple UI scaling models that take into account physical screen size of a processing device, enabling the UI to adjust for available display space to accommodate changes in display sizes. For instance, a created application may run on a smart phone and upon detection of a device having a larger screen, the UI can adapt display for operation on the larger screen device.
  • Examples of the present disclosure comprise evaluation of display class information associated with an application UI at runtime of the application to identify a class of display (e.g., large screen/tablet/slate/phablet/phone, etc.).
  • Display class information may be used to determine whether to display a UI optimized for larger screen devices, smaller screen devices or something in-between.
  • a number of technical advantages are achieved based on the present disclosure including but not limited to: improved scalability of UI for applications, consistent UI displayed across varying display sizes, visually appealing presentation of application command control, enhanced processing capability across devices of varying display sizes including improved efficiency and usability for application command control, improved efficiency in navigation and access to control content, and improved user interaction with applications/application command controls, among other examples.
  • FIGS. 1-3 and the associated descriptions provide a discussion of a variety of operating environments in which examples of the invention may be practiced.
  • the devices and systems illustrated and discussed with respect to FIGS. 1-3 are for purposes of example and illustration and are not limiting of a vast number of computing device configurations that may be utilized for practicing examples of the invention, described herein.
  • FIG. 1 is a block diagram illustrating physical components of a computing device 102 , for example a mobile processing device, with which examples of the present disclosure may be practiced.
  • the computing device 102 may include at least one processing unit 104 and a system memory 106 .
  • the system memory 106 may comprise, but is not limited to, volatile storage (e.g., random access memory), non-volatile storage (e.g., read-only memory), flash memory, or any combination of such memories.
  • the system memory 106 may include an operating system 107 and one or more program modules 108 suitable for running software programs/modules 120 such as IO manager 124 , other utility 126 and application 128 .
  • system memory 106 may store instructions for execution. Other examples of system memory 106 may store data associated with applications.
  • the operating system 107 may be suitable for controlling the operation of the computing device 102 .
  • examples of the invention may be practiced in conjunction with a graphics library, other operating systems, or any other application program and are not limited to any particular application or system.
  • This basic configuration is illustrated in FIG. 1 by those components within a dashed line 122 .
  • the computing device 102 may have additional features or functionality.
  • the computing device 102 may also include additional data storage devices (removable and/or non-removable) such as, for example, magnetic disks, optical disks, or tape. Such additional storage is illustrated in FIG. 1 by a removable storage device 109 and a non-removable storage device 110 .
  • program modules 108 may perform processes including, but not limited to, one or more of the stages of the operations described throughout this disclosure.
  • Other program modules may include electronic mail and contacts applications, word processing applications, spreadsheet applications, database applications, slide presentation applications, drawing or computer-aided application programs, photo editing applications, authoring applications, etc.
  • examples of the invention may be practiced in an electrical circuit comprising discrete electronic elements, packaged or integrated electronic chips containing logic gates, a circuit utilizing a microprocessor, or on a single chip containing electronic elements or microprocessors.
  • examples of the invention may be practiced via a system-on-a-chip (SOC) where each or many of the components illustrated in FIG. 1 may be integrated onto a single integrated circuit.
  • Such an SOC device may include one or more processing units, graphics units, communications units, system virtualization units and various application functionality all of which are integrated (or “burned”) onto the chip substrate as a single integrated circuit.
  • the functionality described herein may be operated via application-specific logic integrated with other components of the computing device 102 on the single integrated circuit (chip).
  • Examples of the present disclosure may also be practiced using other technologies capable of performing logical operations such as, for example, AND, OR, and NOT, including but not limited to mechanical, optical, fluidic, and quantum technologies.
  • examples of the invention may be practiced within a general purpose computer or in any other circuits or systems.
  • the computing device 102 may also have one or more input device(s) 112 such as a keyboard, a mouse, a pen, a sound input device, a device for voice input/recognition, a touch input device, etc.
  • the output device(s) 114 such as a display, speakers, a printer, etc. may also be included.
  • the aforementioned devices are examples and others may be used.
  • the computing device 102 may include one or more communication connections 116 allowing communications with other computing devices 118 . Examples of suitable communication connections 116 include, but are not limited to, RF transmitter, receiver, and/or transceiver circuitry; universal serial bus (USB), parallel, and/or serial ports.
  • Computer readable media may include computer storage media.
  • Computer storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, or program modules.
  • the system memory 106 , the removable storage device 109 , and the non-removable storage device 110 are all computer storage media examples (i.e., memory storage.)
  • Computer storage media may include RAM, ROM, electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other article of manufacture which can be used to store information and which can be accessed by the computing device 102 . Any such computer storage media may be part of the computing device 102 .
  • Computer storage media does not include a carrier wave or other propagated or modulated data signal.
  • Communication media may be embodied by computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and includes any information delivery media.
  • modulated data signal may describe a signal that has one or more characteristics set or changed in such a manner as to encode information in the signal.
  • communication media may include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio frequency (RF), infrared, and other wireless media.
  • FIGS. 2A and 2B illustrate a mobile computing device 200 , for example, a mobile telephone, a smart phone, a personal data assistant, a tablet personal computer, a phablet, a slate, a laptop computer, and the like, with which examples of the invention may be practiced.
  • mobile computing device 200 may be implemented to execute applications and/or application command control.
  • Application command control relates to presentation and control of commands for use with an application through a user interface (UI) or graphical user interface (GUI).
  • application command controls may be programmed specifically to work with a single application. In other examples, application command controls may be programmed to work across more than one application.
  • FIG. 2A one example of a mobile computing device 200 for implementing the examples is illustrated.
  • the mobile computing device 200 is a handheld computer having both input elements and output elements.
  • the mobile computing device 200 typically includes a display 205 and one or more input buttons 210 that allow the user to enter information into the mobile computing device 200 .
  • the display 205 of the mobile computing device 200 may also function as an input device (e.g., a touch screen display). If included, an optional side input element 215 allows further user input.
  • the side input element 215 may be a rotary switch, a button, or any other type of manual input element.
  • mobile computing device 200 may incorporate more or less input elements.
  • the display 205 may not be a touch screen in some examples.
  • the mobile computing device 200 is a portable phone system, such as a cellular phone.
  • the mobile computing device 200 may also include an optional keypad 235 .
  • Optional keypad 235 may be a physical keypad or a “soft” keypad generated on the touch screen display or any other soft input panel (SIP).
  • the output elements include the display 205 for showing a GUI, a visual indicator 220 (e.g., a light emitting diode), and/or an audio transducer 225 (e.g., a speaker).
  • the mobile computing device 200 incorporates a vibration transducer for providing the user with tactile feedback.
  • the mobile computing device 200 incorporates input and/or output ports, such as an audio input (e.g., a microphone jack), an audio output (e.g., a headphone jack), and a video output (e.g., a HDMI port) for sending signals to or receiving signals from an external device.
  • FIG. 2B is a block diagram illustrating the architecture of one example of a mobile computing device. That is, the mobile computing device 200 can incorporate a system (i.e., an architecture) 202 to implement some examples.
  • the system 202 is implemented as a “smart phone” capable of running one or more applications (e.g., browser, e-mail, calendaring, contact managers, messaging clients, games, and media clients/players).
  • the system 202 is integrated as a computing device, such as an integrated personal digital assistant (PDA), tablet and wireless phone.
  • One or more application programs 266 may be loaded into the memory 262 and run on or in association with the operating system 264 .
  • Examples of the application programs include phone dialer programs, e-mail programs, personal information management (PIM) programs, word processing programs, spreadsheet programs, Internet browser programs, messaging programs, and so forth.
  • the system 202 also includes a non-volatile storage area 268 within the memory 262 .
  • the non-volatile storage area 268 may be used to store persistent information that should not be lost if the system 202 is powered down.
  • the application programs 266 may use and store information in the non-volatile storage area 268 , such as e-mail or other messages used by an e-mail application, and the like.
  • a synchronization application (not shown) also resides on the system 202 and is programmed to interact with a corresponding synchronization application resident on a host computer to keep the information stored in the non-volatile storage area 268 synchronized with corresponding information stored at the host computer.
  • other applications may be loaded into the memory 262 and run on the mobile computing device 200 described herein.
  • the system 202 has a power supply 270 , which may be implemented as one or more batteries.
  • the power supply 270 might further include an external power source, such as an AC adapter or a powered docking cradle that supplements or recharges the batteries.
  • the system 202 may include peripheral device port 230 that performs the function of facilitating connectivity between system 202 and one or more peripheral devices. Transmissions to and from the peripheral device port 230 are conducted under control of the operating system (OS) 264 . In other words, communications received by the peripheral device port 230 may be disseminated to the application programs 266 via the operating system 264 , and vice versa.
  • the system 202 may also include a radio interface layer 272 that performs the function of transmitting and receiving radio frequency communications.
  • the radio interface layer 272 facilitates wireless connectivity between the system 202 and the “outside world,” via a communications carrier or service provider. Transmissions to and from the radio interface layer 272 are conducted under control of the operating system 264 . In other words, communications received by the radio interface layer 272 may be disseminated to the application programs 266 via the operating system 264 , and vice versa.
  • the visual indicator 220 may be used to provide visual notifications, and/or an audio interface 274 may be used for producing audible notifications via the audio transducer 225 .
  • the visual indicator 220 is a light emitting diode (LED) and the audio transducer 225 is a speaker.
  • the LED may be programmed to remain on indefinitely, until the user takes action, to indicate the powered-on status of the device.
  • the audio interface 274 is used to provide audible signals to and receive audible signals from the user.
  • the audio interface 274 may also be coupled to a microphone to receive audible input, such as to facilitate a telephone conversation.
  • the microphone may also serve as an audio sensor to facilitate control of notifications, as will be described below.
  • the system 202 may further include a video interface 276 that enables an operation of an on-board camera 230 to record still images, video stream, and the like.
  • a mobile computing device 200 implementing the system 202 may have additional features or functionality.
  • the mobile computing device 200 may also include additional data storage devices (removable and/or non-removable) such as, magnetic disks, optical disks, or tape.
  • additional storage is illustrated in FIG. 2B by the non-volatile storage area 268 .
  • Data/information generated or captured by the mobile computing device 200 and stored via the system 202 may be stored locally on the mobile computing device 200 , as described above, or the data may be stored on any number of storage media that may be accessed by the device via the radio 272 or via a wired connection between the mobile computing device 200 and a separate computing device associated with the mobile computing device 200 , for example, a server computer in a distributed computing network, such as the Internet.
  • data/information may be accessed via the mobile computing device 200 via the radio 272 or via a distributed computing network.
  • data/information may be readily transferred between computing devices for storage and use according to well-known data/information transfer and storage means, including electronic mail and collaborative data/information sharing systems.
  • FIG. 3 illustrates one example of the architecture of a system for providing an application that reliably accesses target data on a storage system and handles communication failures to one or more client devices, as described above.
  • Target data accessed, interacted with, or edited in association with programming modules 108 , applications 120 , and storage/memory may be stored in different communication channels or other storage types.
  • various documents may be stored using a directory service 322 , a web portal 324 , a mailbox service 326 , an instant messaging store 328 , or a social networking site 330 . Application 128 , IO manager 124 , other utility 126 , and storage systems may use any of these types of systems or the like for enabling data utilization, as described herein.
  • a server 320 may provide a storage system for use by a client operating on general computing device 102 and mobile device(s) 200 through network 315 . Network 315 may comprise the Internet or any other type of local or wide area network.
  • client nodes may be implemented as a computing device 102 embodied in a personal computer, a tablet computing device, and/or by a mobile computing device 200 (e.g., mobile processing device). Any of these examples of the client computing device 102 or 200 may obtain content from the store 316 .
  • FIG. 4 is an exemplary method 400 for launching a user interface with which aspects of the present disclosure may be practiced.
  • method 400 may be executed by an exemplary system such as shown in FIGS. 1-3 .
  • method 400 may be executed on a device comprising at least one processor configured to store and execute operations, programs or instructions.
  • method 400 is not limited to such examples.
  • method 400 may be executed (e.g., computer-implemented operations) by one or more components of a distributed network, for instance, web service/distributed network service (e.g. cloud service).
  • method 400 may be performed in association with an application.
  • An application is a software component that executes on the processing device, interfacing with hardware and software components of the device.
  • An application comprises one or more programs designed to carry out operations and is associated with a UI.
  • an application may comprise a UI that is usable to control an application.
  • a UI may comprise an application command control.
  • An application command control is a graphical control element that interfaces with an application that executes on the processing device (e.g., memory, processor and functions of mobile device) and software components such as an operating system (OS), applications executing on a mobile device, programming modules, input methods (e.g., soft input panel (SIP)) and command container such as a pane or contextual menu, among other examples.
  • an application command control is used to control execution of actions/commands for the application.
  • An SIP is an on-screen input method for devices (e.g., text input or voice input).
  • a pane is a software component that assists function of other software running on the device such as the OS and other software applications, among other examples.
  • an application command control may be integrated within an application. For instance, an application command control may be able to be launched, closed, expanded or minimized when an application is launched, closed, expanded or minimized.
  • an application command control is executable as its own application that interfaces with another application. For instance, an application command control may be able to be launched, closed or minimized separately from the launching of an application that is controlled by the application command control.
  • Method 400 begins at operation 402 where a display size associated with a processing device is detected.
  • a processing device may be any device comprising a display screen, at least one memory that is configured to store operations, programs, instructions, and at least one processor that is configured to execute the operations, programs or instructions such as an application/application command control.
  • Display size is a measurement of viewable area for display on a processing device. As an example, display size is a measurement associated with active viewable image size of a processing device. In other examples, display size may be associated with a nominal size value.
  • detecting of the display size comprises detecting a measurement value for screen diagonal of a display of a processing device.
  • detecting of the display size comprises detecting a display width.
  • Operation 402 may comprise a program instruction or module that can identify and evaluate system specifications for a processing device such as a mobile device.
  • the programming instruction implemented in operation 402 identifies a type or version of the processing device and executes a fetch of data to identify system information of the processing device.
  • a programming instruction or module may reference manufacturer specification information to determine a value associated with display size of a processing device.
  • Display size may be a measurement value associated with effective resolution of a display for a processing device. Measurement of effective resolution is one example of a value used to evaluate display form factors with a common metric, and it enables UI scaling to be classified into different display classes.
  • any common metric relative to display size can be applied in exemplary method 400 .
  • Other display features of a processing device may also be evaluated, including processing device orientation (e.g., portrait mode or landscape mode), processing device operational mode (e.g., keyboard mode, touch mode, handwriting/ink mode, etc.), window size, screen aspect ratio, and screen effective resolution, among other examples.
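  • The display size detection of operation 402 can be illustrated with a brief sketch. This is a hypothetical illustration, not the disclosure's operational code: all names (`detect_display_size`, the DPI parameter) are assumptions, and it simply computes a screen diagonal in inches from pixel dimensions and dots-per-inch, one common way to derive a display size value.

```python
import math

def detect_display_size(width_px: int, height_px: int, dpi: float) -> float:
    """Return the screen diagonal in inches from pixel dimensions and DPI."""
    width_in = width_px / dpi
    height_in = height_px / dpi
    return math.hypot(width_in, height_in)

# A 1920x1080 panel reported at 96 DPI has a diagonal of roughly 22.9 inches.
diagonal = detect_display_size(1920, 1080, 96)
```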
  • a display class is determined based on the detected display size of a processing device.
  • Display class determination provides an abstraction for determining the size of a display.
  • a display class can be defined for processing devices having display sizes that fall within the range associated with the display class.
  • Code can query display class information to determine a UI instance to instantiate depending on the display size of the processing device that an application is running on. That is, display classes act as transition points for UI experiences.
  • Display class is a value that is determined based on a maximum display size. The value for display class may be in any form including numeric values and elements of speech, as examples.
  • display classes may be set based on numeric values. For example, a display class may be identified using numeric values (e.g., 0 to 3 inches).
  • display classes are used to classify processing devices in accordance with display size. For example, a display class may be set for processing devices having a display size falling in a range from 0 to 3 inches where another display class may be set for processing devices having a display size in a range from 3.1 to 5 inches, and so on.
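  • The range-based classification described above can be sketched as a simple lookup. This is a hypothetical sketch: the class boundaries and names below are placeholders, not values prescribed by the disclosure.

```python
from bisect import bisect_left

# Hypothetical upper bounds (inches) for each display class; sizes beyond
# the last bound fall into the final, open-ended class (range 0 to infinity).
CLASS_BOUNDS = [3.0, 5.0, 7.0, 12.0]
CLASS_NAMES = ["class-0", "class-1", "class-2", "class-3", "class-4"]

def display_class(diagonal_inches: float) -> str:
    """Map a display diagonal to a display class by range lookup.

    bisect_left keeps a boundary value (e.g., exactly 3.0 inches) in the
    lower class, so the next class effectively starts just above it
    (mirroring ranges like 0-3 inches followed by 3.1-5 inches).
    """
    return CLASS_NAMES[bisect_left(CLASS_BOUNDS, diagonal_inches)]
```

Under this arrangement, adding a display class designation is a data change (one new bound and name) rather than a change to operational code behavior.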
  • a range for values of display classes may fall between 0 and infinity.
  • additional display class designations may be easily added without having to change operational code behavior.
  • display class designations including minimum and/or maximum values for ranges of display classes can be defined in any possible way that can be useful in defining user interface interaction.
  • a minimum value of a display class may be a value that is equal to or greater than a maximum value of a display class which is directly smaller than the display class being defined.
  • a first display class may correspond to a range for devices having displays between 0 and 3 inches and a minimum value of a second display class may take into account a maximum value of the first display class (e.g., 3 inches) and set the minimum value of the second display class at 3.1 inches, for instance.
  • Display classes may be changed over time based on programmer prerogative, analysis/testing/use cases, etc.
  • Operation 404 may comprise one or more programming operations for determining an active display class and reacting to any changes in display class, such as when a processing device of a different display size is connected, an application window changes to a different display, or an effective resolution is changed on a processing device, among other examples.
  • an application programming interface (API) utilizing a shared library of data (e.g., a dynamic link library (DLL)) is used to determine a display class.
  • exemplary operational code associated with a display class determination is not limited to but may be similar to:
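  • The operational code itself is not reproduced here; the following is only a hedged sketch of what a display class query of this kind might look like. All names (`DisplayClassProvider`, `active_class`, `ui_instance_for`, the ranges and UI labels) are hypothetical assumptions, not the disclosure's API.

```python
class DisplayClassProvider:
    """Hypothetical provider that UI code queries to choose a UI instance."""

    # Hypothetical class names keyed by upper bound in inches; the last
    # entry (None) is open-ended, covering sizes up to "infinity".
    RANGES = [(3.0, "small"), (5.0, "medium"),
              (10.0, "large"), (None, "extra-large")]

    def __init__(self, diagonal_inches: float):
        self._diagonal = diagonal_inches

    def active_class(self) -> str:
        """Resolve the detected display size to its display class."""
        for upper, name in self.RANGES:
            if upper is None or self._diagonal <= upper:
                return name

def ui_instance_for(provider: DisplayClassProvider) -> str:
    # Code queries display class information to pick the UI instance to
    # instantiate, making display classes the transition points for the UI.
    return {"small": "phone-ui", "medium": "phablet-ui",
            "large": "tablet-ui", "extra-large": "desktop-ui"}[provider.active_class()]
```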
  • a UI is launched based on the determined display class.
  • a scaled model of a UI may be associated with a display class and operation 406 launches the UI scaled model that is associated with the determined display class. For example, if a determined display class is a class associated with small screen devices (e.g., display sizes less than 4 inches) then an application and application command control is launched that is adapted for small screen devices.
  • a UI scaled model that is launched (operation 406 ) is displayed on the processing device.
  • multiple processing devices may be connected, for example, where a mobile phone is connected to a personal computer or a laptop is connected to a docking station with large screen display(s), among other examples.
  • connecting of multiple devices may result in displaying of a scaled UI model on one processing device (e.g., where a laptop connected to a docking station with a larger screen displays a scaled UI adapted for the larger screen).
  • a user interface may be displayed on a first processing device (e.g., mobile phone) at a first scaled model designed for the determined display class of the first processing device and the user interface may be displayed on a second processing device (e.g., personal computer) at a second scaled model designed for the determined display class of the second processing device.
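  • The multi-device case above can be sketched as resolving each connected device to its own display class and launching the scaled model associated with that class. The device names, class names, and model names below are hypothetical placeholders.

```python
# Hypothetical mapping from display class to a UI scaled model.
SCALED_MODELS = {"small": "compact-model", "large": "full-model"}

def launch_ui(display_classes: dict) -> dict:
    """Return the UI scaled model launched on each connected device.

    `display_classes` maps a connected device to its determined display
    class; each device receives the scaled model for its own class.
    """
    return {device: SCALED_MODELS[cls] for device, cls in display_classes.items()}

# A mobile phone and a connected personal computer each get the scaled
# model designed for their own display class.
models = launch_ui({"mobile-phone": "small", "personal-computer": "large"})
```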
  • FIG. 5 is an exemplary method 500 for adapting a user interface of an application with which aspects of the present disclosure may be practiced.
  • method 500 may be executed by an exemplary system such as shown in FIGS. 1-3 .
  • method 500 may be executed on a device comprising at least one processor configured to store and execute operations, programs or instructions.
  • method 500 is not limited to such examples.
  • method 500 may be executed (e.g., computer-implemented operations) by one or more components of a distributed network, for instance, web service/distributed network service (e.g. cloud service).
  • method 500 may be performed in association with an application and/or application command control.
  • Method 500 begins at decision operation 502 where it is determined whether a display size change is detected or connection of another processing device is detected.
  • Decision operation 502 may comprise one or more programming operations for determining an active display class and reacting to any changes in display class.
  • determination of a display size change may be detection of a changed display width of the first processing device such as when a change in resolution occurs.
  • connection of another processing device may be connecting a mobile phone to a large screen processing device.
  • the present disclosure is not limited to such examples.
  • operations may be executed on a processing device (e.g., in the background or while an application or user interface component is running) to detect potential changes in display class. If no display size change or connection of another device is detected, flow branches NO and processing of method 500 ends.
  • Display class change events may be associated with exemplary operational code described above with respect to method 400 .
  • exemplary operation code used to evaluate display class changes for display class change events is not limited to but may be similar to:
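  • That operational code is likewise not reproduced here; as a hedged sketch, a display class change event might be evaluated by re-classifying the new display size and firing a callback only when the class actually changes. The class and callback names are hypothetical, not from the disclosure.

```python
class DisplayClassWatcher:
    """Hypothetical watcher that raises a callback on display class changes."""

    def __init__(self, classify, on_change):
        self._classify = classify      # maps a display size -> display class
        self._on_change = on_change    # callback(old_class, new_class)
        self._current = None

    def report_display_size(self, diagonal_inches: float) -> None:
        """Call when a resolution change or device connection is detected.

        Re-classifies the new size; if the class differs from the current
        one, a display class change event fires, otherwise nothing happens.
        """
        new_class = self._classify(diagonal_inches)
        if new_class != self._current:
            old, self._current = self._current, new_class
            self._on_change(old, new_class)
```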
  • a display class is determined based on a changed display size or a detected display size of another connected processing device. Detection of a display size and determination of a display class is described in detail in the description of FIG. 4 .
  • Flow proceeds to decision operation 508 where it is determined whether a display class is to be changed.
  • a display class is to be changed upon determining that a display size falls in a range that is different from the range associated with the current display class.
  • Processing registers a display class change event and compares the display class of the new display size against the previous display class. For instance, in the case where a changed display width is detected for a connected processing device, the display class associated with the previous display size is compared with the display class associated with the current display size to determine whether the display class has changed.
  • a detected display class associated with a first processing device is compared with detected current display size of a second processing device to determine whether the display class is to be changed. If evaluation determines that the display class is to remain the same, flow branches NO and processing of method 500 ends.
  • a user interface is adapted to display a scaled model associated with a changed display class.
  • a UI scaled model (e.g., scaled model of a user interface) may be associated with one or more display classes.
  • a UI scaled model may be associated with a UI model for small screen devices, where the UI model is applicable to multiple display classes (e.g., processing devices having a display size of less than 4 inches may include more than one display class).
  • Another UI scaled model may be associated with a UI model for large screen devices.
  • Flow proceeds back to operation 502 where method 500 may start again upon determining that a display size has changed or a new processing device is connected.
  • FIG. 6 is a diagram illustrating user interface scaling models with which aspects of the present disclosure may be practiced. Examples described herein provide a hybrid approach for scaling UI, combining adaptive scaling models to form a scaled UI model for a display class, effectively enabling a user interface to adjust to available display space. Examples of UI scaling models comprise but are not limited to: consistent UI ( 602 ), continuous scaling ( 604 ) and pivoting UI based on distinct inflection points ( 606 ). A display class may be programmed to include more than one scaling model of 602 - 606 to effectively develop a scaled model that is appropriate for a display class.
  • Scaling model 602 is a consistent UI scaling model where one or more components of a UI work the same across all display sizes (e.g., screen sizes). For instance, certain display aspects of a UI may visually appear the same to a user no matter the display class.
  • Some examples of components of a UI that may utilize a consistent UI ( 602 ) scaling model comprise but are not limited to, opening screens, loading screens, actions, etc.
  • Scaling model 604 is a continuous scaling model where one or more components of a UI adapt to available display size. For instance, components of a UI may appear differently depending on whether the display is a small screen display or a larger screen display. Some examples of components of a UI that may utilize a continuous scaling ( 604 ) model comprise but are not limited to, context menus, message bars, notification surfaces, etc.
  • Scaling model 606 is a pivoting UI scaling model where one or more components of a UI radically change once a display size threshold is reached.
  • components of a UI may be programmed differently depending on whether the display is a small screen display or a larger screen display.
  • Some examples of components of a UI that may utilize a pivoting UI ( 606 ) scaling model comprise but are not limited to, file menus, history, application command control, etc.
  • scaling models can be combined, for instance where a UI scaling model displayed may combine multiple scaling models such as scaling model 604 and scaling model 606 .
  • a transition (e.g., detected change in display size/display class) between a UI instance of a processing device having a large display and a UI instance of a processing device having a smaller display may be handled in accordance with scaling model 606 (pivoting UI scaling model). While such a change may trigger scaling model 606 to be implemented, at the same time UI instances may also utilize a continuous scaling model (scaling model 604 ) to display UI elements appropriately to fit a display size of a processing device.
  • two or more UI scaling models 602 - 606 are applied to a display class to enable programmers to adaptively develop a scaled UI model that exhibits a range of behaviors for each display class.
  • UI scaling models can be intelligently adapted to function best based on constraints (e.g., display size limitations) presented by some display classes.
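  • The three scaling models (602-606) and their combination into one scaled UI model might be sketched as follows. The component specifications and policy functions are hypothetical, chosen only to show one consistent, one continuous, and one pivoting component side by side.

```python
def consistent(spec, _diagonal):
    # Consistent UI (602): the component works the same across all display sizes.
    return spec["fixed_layout"]

def continuous(spec, diagonal):
    # Continuous scaling (604): the component adapts to available display size.
    return round(spec["base_size"] * diagonal / spec["ref_diagonal"], 1)

def pivoting(spec, diagonal):
    # Pivoting UI (606): the component radically changes at a size threshold.
    return spec["large_layout"] if diagonal >= spec["threshold"] else spec["small_layout"]

MODELS = {"consistent": consistent, "continuous": continuous, "pivoting": pivoting}

def scale_ui(components: dict, diagonal: float) -> dict:
    """Combine per-component scaling models into one scaled UI model."""
    return {name: MODELS[spec["model"]](spec, diagonal)
            for name, spec in components.items()}

# Hypothetical components: a loading screen stays fixed, a menu scales
# continuously, and the application command control pivots at 7 inches.
COMPONENTS = {
    "loading_screen": {"model": "consistent", "fixed_layout": "centered"},
    "context_menu": {"model": "continuous", "base_size": 10, "ref_diagonal": 5},
    "command_control": {"model": "pivoting", "threshold": 7,
                        "small_layout": "palette", "large_layout": "ribbon"},
}
```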
  • FIG. 7 is a diagram illustrating display for exemplary processing devices of different sizes with which aspects of the present disclosure may be practiced. Examples shown in FIG. 7 comprise processing devices having varying sizes and/or varying screen/display sizes, for example processing device 702 , processing device 704 , processing device 706 and processing device 708 .
  • an application command control and an application/canvas are displayed in exemplary processing devices 702 - 708 .
  • An application command control and an application/canvas are examples of components of a UI with which the present disclosure may apply.
  • the UI is programmed to efficiently scale itself to utilize display space of processing devices of different sizes and/or operating size of display windows. For example, presentation of the application command control and/or application/canvas may vary across the different processing devices 702 - 708 .
  • An application command control and/or an application/canvas may be scaled according to a determined display class associated with a processing device.
  • An application/canvas is a portion of a display of a processing device that is designated for display of an application executing on the device.
  • the application/canvas region is the application UI that shows effects implemented by actions executed via an application command control. That is, the application/canvas is the content comprising, but not limited to, the pages in a workspace or editable portions of an application.
  • An application command control hosts a majority of an application's command set, organized in a hierarchical structure of individual palettes, chunks, and commands. Further, application command control may be programmed to dynamically interact with an application and display simultaneously with applications and/or user interface components such as a soft input panel (SIP) or on-screen keyboard. In one example, application command control may intelligently adapt based on content of an application (e.g., displayed or selected on an application canvas).
  • An application command control comprises a plurality of palettes (command palettes) programmed for application control.
  • a palette is a collection or associated grouping of actions or commands or chunks of commands that can be implemented by an application command control.
  • palettes of an application command control comprise top-level palettes and drill-in palettes.
  • Each of the top-level palettes and the drill-in palettes is a collection or grouping of rows comprising one or more selectable commands or command elements.
  • a top-level palette may comprise a highest-level grouping of commands or functionalities, including commands that are more frequently used or more likely to be used by users.
  • a top-level palette may display command listings that can be drilled into and displayed in drill-in palettes.
  • FIG. 8 illustrates an exemplary top-level palette of an application command control.
  • a drill-in palette is a collection or grouping of commands that may be used less frequently/or likely to be used less frequently compared to the commands displayed on a top-level palette.
  • drill-in palettes host over-flow commands that, due to constraints resulting from a limited amount of display space for an application command control, are not included in a top-level palette.
  • FIG. 9 illustrates an exemplary drill-in palette of an application command control.
  • a top-level palette may comprise high-level commands or functionality for text editing, font editing, paragraph formatting, word finder, spell-check etc. that may be frequently called on by users.
  • a drill-in palette for a word processing application may comprise sub-elements of such high-level commands of the top-level palette, for example, subscript or superscript commands for a font command/function.
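  • The parent/child palette structure described above can be modeled with a small data type. This is an illustrative sketch only; the class and field names are assumptions, not from the disclosure.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Palette:
    """Hypothetical palette: a grouping of command rows with drill-in children."""
    title: str
    commands: List[str] = field(default_factory=list)
    drill_ins: List["Palette"] = field(default_factory=list)

# A top-level "Font" palette (parent) hosts frequently used commands, while
# the less frequently used superscript/subscript commands live together in
# a drill-in palette (child).
font = Palette("Font", ["Bold", "Italic", "Underline"],
               drill_ins=[Palette("Font Formatting", ["Superscript", "Subscript"])])
```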
  • organization of palettes and commands may be editable, for example, where a command or given chunk of a palette can be pulled from one palette and added/displayed in another. For instance, an overflow command of a drill-in palette can be added to a top-level palette.
  • Command grouping data is information relating to the grouping of commands including associations between commands. For example, text editing features such as bolding, underlining, italicization, superscript and subscript may be associated and commonly used. Ideally, the application command control would like to include all of these commonly used functions on the same palette. However, due to limitations on the screen size, certain commands may need to be separated.
  • Command grouping data is information that identifies associations and what commands should or should not be separated from each other.
  • an application command control may determine that the maximum number of rows and commands allows displaying of text formatting commands including a superscript editing command in a top-level palette but would not also allow displaying of a subscript command.
  • From the command grouping data, it may be identified that, from a functionality and/or usability standpoint, it is best not to separate the superscript and subscript editing commands. For instance, a user who makes a subscript text edit may later look to make a superscript edit or vice versa.
  • programmers of the application command control may display a higher-level command for text editing in a top-level palette and the superscript and subscript editing commands may be included in a drill-in palette (child palette) of that top-level palette (parent palette) so they are not separated from each other.
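  • A hedged sketch of the overflow decision driven by command grouping data: groups are kept whole, so a group that does not fit in the top-level palette (such as the superscript/subscript pair) overflows together to a drill-in palette. The function name, group lists, and capacity value are hypothetical.

```python
def assign_palettes(command_groups, capacity):
    """Fill a top-level palette group by group, never splitting a group.

    `command_groups` is a list of associated command groupings (chunks);
    `capacity` is the maximum number of commands the top-level palette can
    display. A group that does not fit whole overflows to a drill-in palette.
    """
    top_level, drill_in = [], []
    for group in command_groups:
        target = top_level if len(top_level) + len(group) <= capacity else drill_in
        target.extend(group)
    return top_level, drill_in

# Text formatting fits, but adding the superscript/subscript pair would
# exceed capacity, so the pair moves to a drill-in palette together.
groups = [["Bold", "Italic", "Underline"], ["Superscript", "Subscript"]]
top, drill = assign_palettes(groups, capacity=4)
```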
  • Examples of common components that make up a top-level palette include but are not limited to: a palette bar and palette title, palette switching feature (including one touch target that launches palette switcher from title of palette bar), command to dismiss palette (e.g., visual representation of ellipses), quick commands (e.g., undo or redo), palette canvas comprising a plurality of commands, chunk commands (e.g., groupings of commands) and chunk dividers (e.g., dividing different groupings of commands), drill-in features to access drill-in palettes (when applicable).
  • palette switching feature including one touch target that launches palette switcher from title of palette bar
  • command to dismiss palette e.g., visual representation of ellipses
  • quick commands e.g., undo or redo
  • palette canvas comprising a plurality of commands
  • chunk commands e.g., groupings of commands
  • chunk dividers e.g., dividing different groupings of commands
  • Examples of common components that make up a drill-in palette can include but are not limited to: a palette bar and palette title, command to navigate back to the parent palette, command to dismiss palette (e.g., visual representation of ellipses), quick commands (e.g., undo or redo), palette canvas comprising a plurality of commands, chunk commands (e.g., groupings of commands) and chunk dividers (e.g., dividing different groupings of commands).
  • palettes of an application command control are presented in a vertical layout.
  • a top-level palette and a drill-in palette are vertically scrollable and comprise a collection of rows comprising one or more selectable command elements.
  • setting of the layout of a palette may also comprise presenting commands in a horizontal layout where commands are horizontally scrollable.
  • no limit is set on the scrollable height of a palette.
  • Scrolling position may be kept on top-level palettes when switching between top-level palettes; however, scrolling position may or may not be kept for drill-in palettes.
  • Commands set and displayed may include labels identifying a command and may be configured to take up an entire row of a palette. In other examples, multiple commands may be displayed in one row of a palette.
  • Scaling is applied to setting and displaying commands in palette rows.
  • commands may not have labels, for example, commands that are well known or have images displayed that are well known to users.
  • Separators or spacers may be displayed to break up different commands or chunks of commands.
  • application command control 802 is an exemplary top-level palette.
  • application command control 902 is an exemplary drill-in palette.
  • application command control 902 displays a drill-in palette of the top-level palette 802 shown in FIG. 8 , where top-level palette 802 is a parent palette of the drill-in palette 902 (e.g., child palette of the top-level palette).
  • a row showing a “font formatting” command includes a caret indicative of a drill-in feature.
  • a drill-in palette of application command control 902 is displayed on a display of a processing device.
  • font formatting command features “superscript” and “subscript” are displayed. In this way, application command control and/or an application/canvas may be scaled in accordance with a determined display class associated with a processing device.


Abstract

Non-limiting examples of the present disclosure describe adaptively scaling a user interface based on detection of a display size associated with a connected processing device. A display size associated with a connected processing device is detected. A display class is determined based on the detected display size. A user interface for an application is launched based on the determined display class. Other examples are also described.

Description

    PRIORITY
  • This application claims the benefit of U.S. Provisional Application No. 62/076,368, filed on Nov. 6, 2014, which is hereby incorporated by reference in its entirety.
  • BACKGROUND
  • Devices such as personal computers (PCs), laptops, slates, and phones offer a wide range of screen sizes. However, there is no established method for scaling a user interface (UI) across a large range of screen sizes, from very large displays down to smaller displays. It is with respect to this general technical area that the present application is directed.
  • SUMMARY
  • Non-limiting examples of the present disclosure describe user interface scaling based on a detected display size associated with a connected processing device. A display size associated with a connected processing device is detected. A display class is determined based on the detected display size. A user interface for an application is launched on the connected processing device based on the determined display class.
  • In other non-limiting examples, a user interface is scaled based on connection of a processing device having a different display size from a first processing device. A user interface for an application is launched at a first scaled model based on determining a display class associated with a first processing device. The display class associated with the first processing device is detected based on a determined display size of the first processing device. Connection of a second processing device is detected. A display class associated with the second processing device is determined upon connection of the second processing device. The display class associated with the second processing device is detected based on a determined display size of the second processing device. The user interface is adapted to display, on the second processing device, at a second scaled model designed for the second processing device upon determining that the display class of the second processing device is different from the display class of the first processing device.
  • This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Additional aspects, features, and/or advantages of examples will be set forth in part in the description which follows and, in part, will be apparent from the description, or may be learned by practice of the disclosure.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Non-limiting and non-exhaustive examples are described with reference to the following figures.
  • FIG. 1 is a block diagram illustrating an example of a computing device with which aspects of the present disclosure may be practiced.
  • FIGS. 2A and 2B are simplified block diagrams of a mobile computing device with which aspects of the present disclosure may be practiced.
  • FIG. 3 is a simplified block diagram of a distributed computing system in which aspects of the present disclosure may be practiced.
  • FIG. 4 is an exemplary method for launching a user interface with which aspects of the present disclosure may be practiced.
  • FIG. 5 is an exemplary method for adapting a user interface with which aspects of the present disclosure may be practiced.
  • FIG. 6 is a diagram illustrating user interface scaling models with which aspects of the present disclosure may be practiced.
  • FIG. 7 is a diagram illustrating display for exemplary processing devices of different sizes with which aspects of the present disclosure may be practiced.
  • FIG. 8 is a diagram illustrating user interface examples with which aspects of the present disclosure may be practiced.
  • FIG. 9 is a diagram illustrating user interface examples with which aspects of the present disclosure may be practiced.
  • DETAILED DESCRIPTION
  • Users of processing devices desire applications to be optimized in a form-factor-specific manner. However, there is no established method of scaling a user interface (UI) across a large range of screen sizes with such an approach. Simply attempting to merge a large screen version of an application with a small screen version of an application creates complications. As an example, when large screen versions of applications are executed on devices having smaller display sizes, the UI gets too crowded and touch targets become too small. Additionally, another complication is that UIs are not traditionally scalable across devices having different display sizes. For instance, a user of a processing device may be viewing an application on a device having a smaller display size (e.g., mobile phone) and proceed to connect the device having the small screen display to a device having a larger display size (e.g., PC). Attempted resizing of an application across differing display sizes may drastically affect the display and operation of the UI for an application and/or application control. In other cases where different versions of an application are developed (e.g., mobile version and desktop version), systems are typically unable to recognize that a UI is to be scaled to a different programmed version to account for display size changes. Other instances of building UI packages may incorporate a scaling model for large and small screen devices but are only able to show a single type of UI (e.g., phone version or slate version) once an application is installed. This may limit a user's ability to connect to large display screens and enjoy a UI that takes advantage of available display space.
  • Examples of the present disclosure describe a hybrid approach for scaling UI that accommodates for changes in display size resulting from display size change and/or connection of devices having different screen sizes. In examples, applications are developed that can execute/run on a plurality of devices having different display sizes. Examples of a scalable UI of the present disclosure combine multiple UI scaling models that take into account physical screen size of a processing device, enabling the UI to adjust for available display space to accommodate changes in display sizes. For instance, a created application may run on a smart phone and upon detection of a device having a larger screen, the UI can adapt display for operation on the larger screen device. Examples of the present disclosure comprise evaluation of display class information associated with an application UI at runtime of the application to identify a class of display (e.g., large screen/tablet/slate/phablet/phone, etc.). Display class information may be used to determine whether to display a UI optimized for larger screen devices, smaller screen devices or something in-between.
  • A number of technical advantages are achieved based on the present disclosure including but not limited to: improved scalability of UI for applications, consistent UI displayed across varying display sizes, visually appealing presentation of application command control, enhanced processing capability across devices of varying display sizes including improved efficiency and usability for application command control, improved efficiency in navigation and access to control content, and improved user interaction with applications/application command controls, among other examples.
  • FIGS. 1-3 and the associated descriptions provide a discussion of a variety of operating environments in which examples of the invention may be practiced. However, the devices and systems illustrated and discussed with respect to FIGS. 1-3 are for purposes of example and illustration and are not limiting of a vast number of computing device configurations that may be utilized for practicing examples of the invention, described herein.
  • FIG. 1 is a block diagram illustrating physical components of a computing device 102, for example a mobile processing device, with which examples of the present disclosure may be practiced. In a basic configuration, the computing device 102 may include at least one processing unit 104 and a system memory 106. Depending on the configuration and type of computing device, the system memory 106 may comprise, but is not limited to, volatile storage (e.g., random access memory), non-volatile storage (e.g., read-only memory), flash memory, or any combination of such memories. The system memory 106 may include an operating system 107 and one or more program modules 108 suitable for running software programs/modules 120 such as IO manager 124, other utility 126 and application 128. As examples, system memory 106 may store instructions for execution. Other examples of system memory 106 may store data associated with applications. The operating system 107, for example, may be suitable for controlling the operation of the computing device 102. Furthermore, examples of the invention may be practiced in conjunction with a graphics library, other operating systems, or any other application program and is not limited to any particular application or system. This basic configuration is illustrated in FIG. 1 by those components within a dashed line 122. The computing device 102 may have additional features or functionality. For example, the computing device 102 may also include additional data storage devices (removable and/or non-removable) such as, for example, magnetic disks, optical disks, or tape. Such additional storage is illustrated in FIG. 1 by a removable storage device 109 and a non-removable storage device 110.
  • As stated above, a number of program modules and data files may be stored in the system memory 106. While executing on the processing unit 104, program modules 108 (e.g., Input/Output (I/O) manager 124, other utility 126 and application 128) may perform processes including, but not limited to, one or more of the stages of the operations described throughout this disclosure. Other program modules that may be used in accordance with examples of the present invention may include electronic mail and contacts applications, word processing applications, spreadsheet applications, database applications, slide presentation applications, drawing or computer-aided application programs, photo editing applications, authoring applications, etc.
  • Furthermore, examples of the invention may be practiced in an electrical circuit comprising discrete electronic elements, packaged or integrated electronic chips containing logic gates, a circuit utilizing a microprocessor, or on a single chip containing electronic elements or microprocessors. For example, examples of the invention may be practiced via a system-on-a-chip (SOC) where each or many of the components illustrated in FIG. 1 may be integrated onto a single integrated circuit. Such an SOC device may include one or more processing units, graphics units, communications units, system virtualization units and various application functionality all of which are integrated (or “burned”) onto the chip substrate as a single integrated circuit. When operating via an SOC, the functionality described herein may be operated via application-specific logic integrated with other components of the computing device 102 on the single integrated circuit (chip). Examples of the present disclosure may also be practiced using other technologies capable of performing logical operations such as, for example, AND, OR, and NOT, including but not limited to mechanical, optical, fluidic, and quantum technologies. In addition, examples of the invention may be practiced within a general purpose computer or in any other circuits or systems.
  • The computing device 102 may also have one or more input device(s) 112 such as a keyboard, a mouse, a pen, a sound input device, a device for voice input/recognition, a touch input device, etc. The output device(s) 114 such as a display, speakers, a printer, etc. may also be included. The aforementioned devices are examples and others may be used. The computing device 102 may include one or more communication connections 116 allowing communications with other computing devices 118. Examples of suitable communication connections 116 include, but are not limited to, RF transmitter, receiver, and/or transceiver circuitry; universal serial bus (USB), parallel, and/or serial ports.
  • The term computer readable media as used herein may include computer storage media. Computer storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, or program modules. The system memory 106, the removable storage device 109, and the non-removable storage device 110 are all computer storage media examples (i.e., memory storage.) Computer storage media may include RAM, ROM, electrically erasable read-only memory (EEPROM), flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other article of manufacture which can be used to store information and which can be accessed by the computing device 102. Any such computer storage media may be part of the computing device 102. Computer storage media does not include a carrier wave or other propagated or modulated data signal.
  • Communication media may be embodied by computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and includes any information delivery media. The term “modulated data signal” may describe a signal that has one or more characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media may include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio frequency (RF), infrared, and other wireless media.
  • FIGS. 2A and 2B illustrate a mobile computing device 200, for example, a mobile telephone, a smart phone, a personal data assistant, a tablet personal computer, a phablet, a slate, a laptop computer, and the like, with which examples of the invention may be practiced. For example, mobile computing device 200 may be implemented to execute applications and/or application command control. Application command control relates to presentation and control of commands for use with an application through a user interface (UI) or graphical user interface (GUI). In one example, application command controls may be programmed specifically to work with a single application. In other examples, application command controls may be programmed to work across more than one application. With reference to FIG. 2A, one example of a mobile computing device 200 for implementing the examples is illustrated. In a basic configuration, the mobile computing device 200 is a handheld computer having both input elements and output elements. The mobile computing device 200 typically includes a display 205 and one or more input buttons 210 that allow the user to enter information into the mobile computing device 200. The display 205 of the mobile computing device 200 may also function as an input device (e.g., a touch screen display). If included, an optional side input element 215 allows further user input. The side input element 215 may be a rotary switch, a button, or any other type of manual input element. In alternative examples, mobile computing device 200 may incorporate more or fewer input elements. For example, the display 205 may not be a touch screen in some examples. In yet another alternative example, the mobile computing device 200 is a portable phone system, such as a cellular phone. The mobile computing device 200 may also include an optional keypad 235.
Optional keypad 235 may be a physical keypad or a “soft” keypad generated on the touch screen display or any other soft input panel (SIP). In various examples, the output elements include the display 205 for showing a GUI, a visual indicator 220 (e.g., a light emitting diode), and/or an audio transducer 225 (e.g., a speaker). In some examples, the mobile computing device 200 incorporates a vibration transducer for providing the user with tactile feedback. In yet another example, the mobile computing device 200 incorporates input and/or output ports, such as an audio input (e.g., a microphone jack), an audio output (e.g., a headphone jack), and a video output (e.g., an HDMI port) for sending signals to or receiving signals from an external device.
  • FIG. 2B is a block diagram illustrating the architecture of one example of a mobile computing device. That is, the mobile computing device 200 can incorporate a system (i.e., an architecture) 202 to implement some examples. In one example, the system 202 is implemented as a “smart phone” capable of running one or more applications (e.g., browser, e-mail, calendaring, contact managers, messaging clients, games, and media clients/players). In some examples, the system 202 is integrated as a computing device, such as an integrated personal digital assistant (PDA), tablet and wireless phone.
  • One or more application programs 266 may be loaded into the memory 262 and run on or in association with the operating system 264. Examples of the application programs include phone dialer programs, e-mail programs, personal information management (PIM) programs, word processing programs, spreadsheet programs, Internet browser programs, messaging programs, and so forth. The system 202 also includes a non-volatile storage area 268 within the memory 262. The non-volatile storage area 268 may be used to store persistent information that should not be lost if the system 202 is powered down. The application programs 266 may use and store information in the non-volatile storage area 268, such as e-mail or other messages used by an e-mail application, and the like. A synchronization application (not shown) also resides on the system 202 and is programmed to interact with a corresponding synchronization application resident on a host computer to keep the information stored in the non-volatile storage area 268 synchronized with corresponding information stored at the host computer. As should be appreciated, other applications may be loaded into the memory 262 and run on the mobile computing device 200 described herein.
  • The system 202 has a power supply 270, which may be implemented as one or more batteries. The power supply 270 might further include an external power source, such as an AC adapter or a powered docking cradle that supplements or recharges the batteries.
  • The system 202 may include peripheral device port 230 that performs the function of facilitating connectivity between system 202 and one or more peripheral devices. Transmissions to and from the peripheral device port 230 are conducted under control of the operating system (OS) 264. In other words, communications received by the peripheral device port 230 may be disseminated to the application programs 266 via the operating system 264, and vice versa.
  • The system 202 may also include a radio interface layer 272 that performs the function of transmitting and receiving radio frequency communications. The radio interface layer 272 facilitates wireless connectivity between the system 202 and the “outside world,” via a communications carrier or service provider. Transmissions to and from the radio interface layer 272 are conducted under control of the operating system 264. In other words, communications received by the radio interface layer 272 may be disseminated to the application programs 266 via the operating system 264, and vice versa.
  • The visual indicator 220 may be used to provide visual notifications, and/or an audio interface 274 may be used for producing audible notifications via the audio transducer 225. In the illustrated example, the visual indicator 220 is a light emitting diode (LED) and the audio transducer 225 is a speaker. These devices may be directly coupled to the power supply 270 so that when activated, they remain on for a duration dictated by the notification mechanism even though the processor 260 and other components might shut down for conserving battery power. The LED may be programmed to remain on indefinitely until the user takes action to indicate the powered-on status of the device. The audio interface 274 is used to provide audible signals to and receive audible signals from the user. For example, in addition to being coupled to the audio transducer 225, the audio interface 274 may also be coupled to a microphone to receive audible input, such as to facilitate a telephone conversation. In accordance with examples of the present invention, the microphone may also serve as an audio sensor to facilitate control of notifications, as will be described below. The system 202 may further include a video interface 276 that enables an operation of an on-board camera 230 to record still images, video stream, and the like.
  • A mobile computing device 200 implementing the system 202 may have additional features or functionality. For example, the mobile computing device 200 may also include additional data storage devices (removable and/or non-removable) such as, magnetic disks, optical disks, or tape. Such additional storage is illustrated in FIG. 2B by the non-volatile storage area 268.
  • Data/information generated or captured by the mobile computing device 200 and stored via the system 202 may be stored locally on the mobile computing device 200, as described above, or the data may be stored on any number of storage media that may be accessed by the device via the radio 272 or via a wired connection between the mobile computing device 200 and a separate computing device associated with the mobile computing device 200, for example, a server computer in a distributed computing network, such as the Internet. As should be appreciated such data/information may be accessed via the mobile computing device 200 via the radio 272 or via a distributed computing network. Similarly, such data/information may be readily transferred between computing devices for storage and use according to well-known data/information transfer and storage means, including electronic mail and collaborative data/information sharing systems.
  • FIG. 3 illustrates one example of the architecture of a system for providing an application that reliably accesses target data on a storage system and handles communication failures to one or more client devices, as described above. Target data accessed, interacted with, or edited in association with programming modules 108, applications 120, and storage/memory may be stored in different communication channels or other storage types. For example, various documents may be stored using a directory service 322, a web portal 324, a mailbox service 326, an instant messaging store 328, or a social networking site 330. Application 128, IO manager 124, other utility 126, and storage systems may use any of these types of systems or the like for enabling data utilization, as described herein. A server 320 may provide a storage system for use by a client operating on general computing device 102 and mobile device(s) 200 through network 315. By way of example, network 315 may comprise the Internet or any other type of local or wide area network, and client nodes may be implemented as a computing device 102 embodied in a personal computer, a tablet computing device, and/or by a mobile computing device 200 (e.g., mobile processing device). Any of these examples of the client computing device 102 or 200 may obtain content from the store 316.
  • FIG. 4 is an exemplary method 400 for launching a user interface with which aspects of the present disclosure may be practiced. As an example, method 400 may be executed by an exemplary system such as shown in FIGS. 1-3. In examples, method 400 may be executed on a device comprising at least one processor configured to store and execute operations, programs or instructions. However, method 400 is not limited to such examples. In at least one example, method 400 may be executed (e.g., computer-implemented operations) by one or more components of a distributed network, for instance, web service/distributed network service (e.g. cloud service).
  • In examples, method 400 may be performed in association with an application. An application is a software component that executes on the processing device, interfacing with hardware and software components of the device. An application comprises one or more programs designed to carry out operations and is associated with a UI. In examples, an application may comprise a UI that is usable to control an application. In examples, a UI may comprise an application command control. An application command control is a graphical control element that interfaces with an application that executes on the processing device (e.g., memory, processor and functions of a mobile device) and software components such as an operating system (OS), applications executing on a mobile device, programming modules, input methods (e.g., soft input panel (SIP)) and command containers such as a pane or contextual menu, among other examples. As an example, an application command control is used to control execution of actions/commands for the application. An SIP is an on-screen input method for devices (e.g., text input or voice input), and a pane is a software component that assists function of other software running on the device such as the OS and other software applications, among other examples. In some examples, an application command control may be integrated within an application. For instance, an application command control may be able to be launched, closed, expanded or minimized when an application is launched, closed, expanded or minimized. In other examples, an application command control is executable as its own application that interfaces with another application. For instance, an application command control may be able to be launched, closed or minimized separately from the launching of an application that is controlled by the application command control.
  • Method 400 begins at operation 402 where a display size associated with a processing device is detected. A processing device may be any device comprising a display screen, at least one memory that is configured to store operations, programs, instructions, and at least one processor that is configured to execute the operations, programs or instructions such as an application/application command control. Display size is a measurement of viewable area for display on a processing device. As an example, display size is a measurement associated with active viewable image size of a processing device. In other examples, display size may be associated with a nominal size value. In one example, detecting of the display size comprises detecting a measurement value for screen diagonal of a display of a processing device. In another example, detecting of the display size comprises detecting a display width (e.g. width of the display for the processing device or operating size of a display window for an application executing on the processing device). Examples of a display size may comprise physical image size or logical image size, among other examples. Operation 402 may comprise a program instruction or module that can identify and evaluate system specifications for a processing device such as a mobile device. In one example, the programming instruction implemented in operation 402 identifies a type or version of the processing device and executes a fetch of data to identify system information of the processing device. In another example, a programming instruction or module may reference manufacturer specification information to determine a value associated with display size of a processing device.
  • Factors that may be evaluated to determine a display size include but are not limited to: dot density (e.g., dots per inch (DPI)), pixel density (e.g., pixels per inch (PPI)), physical size of a screen/display, screen diagonal of a display of a processing device, use case distance of a display from a user, display length, and display width, among other examples. As an example, display size may be a measurement value associated with effective resolution of a display for a processing device. Effective resolution is an example of a value used to evaluate display form factors with a common metric, and enables UI scaling to be classified into different display classes. However, one skilled in the art will recognize that any common metric relative to display size can be applied in exemplary method 400. In alternative examples, factors other than display size may impact UI adaptation. Examples include but are not limited to: processing device orientation, processing device operational mode (e.g., keyboard mode, touch mode, handwriting/ink mode, etc.), window size, screen aspect ratio, and screen effective resolution, among other examples.
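As a non-limiting sketch of how such factors might be combined (the helper names, signatures, and values here are illustrative assumptions, not part of the disclosure), the screen diagonal and a simple effective-resolution metric can be derived from pixel dimensions and dot density:

```cpp
#include <cmath>

// Hypothetical helper: derives the physical screen diagonal in inches
// from pixel dimensions and dot density (DPI), two of the factors
// listed above for evaluating display size.
double ScreenDiagonalInches(int widthPx, int heightPx, double dpi)
{
    const double widthIn = widthPx / dpi;
    const double heightIn = heightPx / dpi;
    return std::sqrt(widthIn * widthIn + heightIn * heightIn);
}

// A simple effective-resolution metric: logical width after applying a
// display scale factor, giving a common measure across form factors.
double EffectiveWidth(int widthPx, double scaleFactor)
{
    return widthPx / scaleFactor;
}
```

For instance, a 1920x1080 panel at 440 DPI works out to roughly a 5-inch diagonal, placing it with other small-screen devices regardless of its high pixel count.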
  • Flow proceeds to operation 404 where a display class is determined based on the detected display size of a processing device. Display class determination provides an abstraction for determining the size of a display. A display class can be defined for processing devices having display sizes that fall within the range associated with the display class. Code can query display class information to determine a UI instance to instantiate depending on the display size of the processing device that an application is running on. That is, display classes act as transition points for UI experiences. Display class is a value that is determined based on a maximum display size. The value for display class may be in any form including numeric values and elements of speech, as examples. For instance, display classes may be set to correspond with different types of processing devices (e.g., laptops, PCs, tablets, phones, etc.) where an exemplary display class may be “<=Phone” or “<=Tablet”. In another example, display classes may be set based on numeric values. For example, a display class may be identified using numeric values (e.g., 0 to 3 inches). In any example, display classes are used to classify processing devices in accordance with display size. For example, a display class may be set for processing devices having a display size falling in a range from 0 to 3 inches, where another display class may be set for processing devices having a display size in a range from 3.1 to 5 inches, and so on. A range for values of display classes may fall between 0 and infinity. In one example, operations for display class determination are written in the style of successive less than or equal to (<=) checks, with an else for everything greater than a defined display class. In this example, additional display class designations may be easily added without having to change operational code behavior.
However, one skilled in the art will recognize that display class designations, including minimum and/or maximum values for ranges of display classes, can be defined in any way that is useful in defining user interface interaction. In examples, a minimum value of a display class may be a value that is equal to or greater than a maximum value of the display class which is directly smaller than the display class being defined. For instance, as in the example above, a first display class may correspond to a range for devices having displays between 0 and 3 inches, and a minimum value of a second display class may take into account the maximum value of the first display class (e.g., 3 inches) and set the minimum value of the second display class at 3.1 inches, for instance. Display classes may be changed over time based on programmer prerogative, analysis/testing/use cases, etc.
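The successive less-than-or-equal checks described above might be sketched as follows; the class names and inch ranges are illustrative assumptions mirroring the 0 to 3 inch and 3.1 to 5 inch examples:

```cpp
// Illustrative display classes. A new class can be added by inserting
// one more <= check, without changing the surrounding operational code.
enum class DisplayClass { Phone, Phablet, LargeDisplay };

DisplayClass ClassForDiagonal(double diagonalInches)
{
    if (diagonalInches <= 3.0) return DisplayClass::Phone;    // 0 to 3 inches
    if (diagonalInches <= 5.0) return DisplayClass::Phablet;  // 3.1 to 5 inches
    return DisplayClass::LargeDisplay;  // else: everything greater
}
```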
  • Operation 404 may comprise one or more programming operations for determining an active display class, and reacting to any changes in display class such as when a processing device of a different display size is connected, an application window changes to a different display, or an effective resolution is changed on a processing device, among other examples. In one example, an application programming interface (API) utilizing a shared library of data (e.g., a dynamic link library (DLL)) is used to determine a display class. As one example, exemplary operational code associated with a display class determination (e.g., display class event) is not limited to but may be similar to:
    • /** Interface to register against for display class change events */
    • struct IDisplayClassInformation : public Mso::IRefCounted
    • {
    • public:
    • /** Returns the event store for display class change events. This event store will be invoked—Whenever the running application changes to a different display with a new display—Whenever the active display changes its DPI */
    • virtual DisplayClassChangedEvent& DisplayClassChanged() = 0;
    • virtual DisplayClass GetCurrentDisplayClass() const = 0;
    • };
    • /** Get a DisplayClassInformation reference on the active UI thread */
    • MSOCPPAPI_(Mso::TCntPtr<Mso::DisplayClassInformation::IDisplayClassInformation>)
    • MakeDisplayClassInformation();
  • Once a display class is determined (operation 404), flow proceeds to operation 406 where a UI is launched based on the determined display class. A scaled model of a UI may be associated with a display class, and operation 406 launches the UI scaled model that is associated with the determined display class. For example, if a determined display class is a class associated with small screen devices (e.g., display sizes less than 4 inches), then an application and application command control adapted for small screen devices are launched.
  • Flow proceeds to operation 408 where a UI scaled model that is launched (operation 406) is displayed on the processing device. In some examples, multiple processing devices may be connected, for example, where a mobile phone is connected to a personal computer or a laptop is connected to a docking station with large screen display(s), among other examples. In some examples, connecting of multiple devices may result in displaying of a scaled UI model on one processing device (e.g., where a laptop connected to a docking station with a larger screen displays a scaled UI adapted for the larger screen). In alternative examples, a user interface may be displayed on a first processing device (e.g., mobile phone) at a first scaled model designed for the determined display class of the first processing device and the user interface may be displayed on a second processing device (e.g., personal computer) at a second scaled model designed for the determined display class of the second processing device.
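Operations 406-408 can be sketched as a lookup from the determined display class to the UI scaled model to instantiate; the model names and the grouping of classes into shared models are assumptions for illustration only:

```cpp
#include <string>

enum class DisplayClass { Phone, Phablet, Tablet, LargeDisplay };

// Several display classes may share one scaled model, as when all
// devices under a size threshold use the small-screen UI.
std::string ScaledModelFor(DisplayClass dc)
{
    switch (dc)
    {
    case DisplayClass::Phone:
    case DisplayClass::Phablet:
        return "small-screen UI";
    case DisplayClass::Tablet:
        return "intermediate UI";
    default:
        return "large-screen UI";
    }
}
```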
  • FIG. 5 is an exemplary method 500 for adapting a user interface of an application with which aspects of the present disclosure may be practiced. As an example, method 500 may be executed by an exemplary system such as shown in FIGS. 1-3. In examples, method 500 may be executed on a device comprising at least one processor configured to store and execute operations, programs or instructions. However, method 500 is not limited to such examples. In at least one example, method 500 may be executed (e.g., computer-implemented operations) by one or more components of a distributed network, for instance, web service/distributed network service (e.g. cloud service). In examples, method 500 may be performed in association with an application and/or application command control.
  • In examples described herein, UI can be adapted to accommodate both large screen devices and small screen devices. Method 500 begins at decision operation 502 where it is determined whether a display size change is detected or connection of another processing device is detected. Decision operation 502 may comprise one or more programming operations for determining an active display class, and reacting to any changes in display class. As an example, determination of a display size change may be detection of a changed display width of the first processing device such as when a change in resolution occurs. As an example, connection of another processing device may be connecting a mobile phone to a large screen processing device. However, one skilled in the art will recognize that the present disclosure is not limited to such examples. In examples, operations (e.g., API) may be executed on a processing device (e.g., in the background or while an application or user interface component is running) to detect potential changes in display class. If no display size change or connection of another device is detected, flow branches NO and processing of method 500 ends.
  • If a display size change or connection of another device is detected, flow branches YES and proceeds to operation 504 where a display class change event is initiated. Display class change events may be associated with exemplary operational code described above with respect to method 400. In additional examples, exemplary operational code used to evaluate display class changes for display class change events is not limited to but may be similar to:
    • /** Callbacks to react to display class change events */
    • typedef std::function<void (const DisplayClass& oldDisplay, const DisplayClass& newDisplay)> DisplayClassChangedCallback;
    • /** Events for display class changes */
    • typedef Mso::Async::EventSource<DisplayClassChangedCallback, Mso::Async::VoidCallbackStoreWithCritSecTraits> DisplayClassChangedEvent;
    • /** Office-wide display classes */
    • enum class DisplayClass : uint32_t
    • {
    • SmallPhone = 1, // reference device < 4.3″, effective resolution reference Nokia 520 (x, y)
    • Phablet, // reference device < 8″, effective resolution reference Nokia 1520 (x, y)
    • LargeDisplay, // reference device < 27″, effective resolution reference monitor (x, y)
    • Infinite // Represents all devices larger than the reference devices listed
    • };
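A minimal stand-in for the event types declared above might look as follows; std::function and std::vector are used in place of the Mso library types, which are internal and assumed here, so this is a sketch rather than the actual implementation:

```cpp
#include <cstdint>
#include <functional>
#include <utility>
#include <vector>

enum class DisplayClass : uint32_t { SmallPhone = 1, Phablet, LargeDisplay, Infinite };

// Callback signature matching DisplayClassChangedCallback above: each
// callback receives the old and the new display class.
using DisplayClassChangedCallback =
    std::function<void(const DisplayClass& oldDisplay, const DisplayClass& newDisplay)>;

// Simplified event store: registered callbacks are invoked whenever the
// active display (or its DPI) changes to a different display class.
class DisplayClassChangedEvent
{
public:
    void Register(DisplayClassChangedCallback cb)
    {
        m_callbacks.push_back(std::move(cb));
    }

    void Raise(DisplayClass oldDc, DisplayClass newDc)
    {
        for (const auto& cb : m_callbacks)
            cb(oldDc, newDc);
    }

private:
    std::vector<DisplayClassChangedCallback> m_callbacks;
};
```

A UI component would register a callback once at startup and re-layout inside it when the new class differs from the old.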
  • At operation 506, a display class is determined based on a changed display size or a detected display size of another connected processing device. Detection of a display size and determination of a display class are described in detail in the description of FIG. 4. Flow proceeds to decision operation 508 where it is determined whether a display class is to be changed. A display class is to be changed upon determining that a display size falls in a range that is different from the range associated with the current display class. Processing registers a display class change event and compares a new display size against a previous display class. For instance, in the case where a changed display width is detected for a connected processing device, a detected display class associated with a previous display size is compared with the current display size to determine whether the display class has changed. In an example where additional processing devices are connected, a detected display class associated with a first processing device is compared with a detected current display size of a second processing device to determine whether the display class is to be changed. If evaluation determines that the display class is to remain the same, flow branches NO and processing of method 500 ends.
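Decision operation 508 can be sketched as comparing the class of the previous display size with the class of the new one; the class names and inch ranges are again illustrative assumptions:

```cpp
// Illustrative ranges: the UI is adapted only when the new display
// size falls in a different range than the previous one.
enum class DisplayClass { Phone, Phablet, LargeDisplay };

DisplayClass ClassForDiagonal(double diagonalInches)
{
    if (diagonalInches <= 3.0) return DisplayClass::Phone;
    if (diagonalInches <= 5.0) return DisplayClass::Phablet;
    return DisplayClass::LargeDisplay;
}

// True when the display class changed, i.e. flow should branch YES
// to operation 510 and adapt the UI.
bool DisplayClassChanged(double previousDiagonal, double currentDiagonal)
{
    return ClassForDiagonal(previousDiagonal) != ClassForDiagonal(currentDiagonal);
}
```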
  • If the display class is to be changed, flow branches YES and proceeds to operation 510 where the UI is adapted in accordance with the determined display class. For example, a user interface is adapted to display a scaled model associated with a changed display class. A UI scaled model (e.g., scaled model of a user interface) may be associated with one or more display classes. For instance, a UI scaled model may be associated with a UI model for small screen devices, where the UI model is applicable to multiple display classes (e.g., processing devices having a display size of less than 4 inches may include more than one display class). Another UI scaled model may be associated with a UI model for large screen devices.
  • Flow proceeds back to operation 502 where method 500 may start again upon determining that a display size has changed or a new processing device is connected.
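The class-determination and adaptation flow of operations 502-510 may be sketched as follows. The display class names, width breakpoints, and scaled-model names below are illustrative assumptions, not values taken from the disclosure:

```typescript
// Hypothetical display classes keyed to width ranges (inches); the
// breakpoints here are illustrative, not from the disclosure.
type DisplayClass = "small" | "medium" | "large";

function classifyDisplay(widthInches: number): DisplayClass {
  if (widthInches < 4) return "small";
  if (widthInches < 10) return "medium";
  return "large";
}

// Operation 508: the display class is to be changed when the detected
// size falls in a range different from the current class's range.
function displayClassChanged(
  current: DisplayClass,
  detectedWidthInches: number
): boolean {
  return classifyDisplay(detectedWidthInches) !== current;
}

// Operation 510: one scaled UI model may serve several display classes.
const scaledModelForClass: Record<DisplayClass, string> = {
  small: "smallScreenModel",
  medium: "smallScreenModel", // multiple classes can share one model
  large: "largeScreenModel",
};

function adaptUserInterface(widthInches: number): string {
  return scaledModelForClass[classifyDisplay(widthInches)];
}
```

Under these assumed breakpoints, connecting a 12-inch device to a device currently classed as "small" would register a class change and swap in the large-screen scaled model.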
  • FIG. 6 is a diagram illustrating user interface scaling models with which aspects of the present disclosure may be practiced. Examples described herein provide a hybrid approach for scaling UI, combining adaptive scaling models to form a scaled UI model for a display class, effectively enabling a user interface to adjust to available display space. Examples of UI scaling models comprise but are not limited to: consistent UI (602), continuous scaling (604) and pivoting UI based on distinct inflection points (606). A display class may be programmed to include more than one scaling model of 602-606 to effectively develop a scaled model that is appropriate for a display class.
  • Scaling model 602 is a consistent UI scaling model where one or more components of a UI work the same across all display sizes (e.g., screen sizes). For instance, certain display aspects of a UI may visually appear the same to a user no matter the display class. Some examples of components of a UI that may utilize a consistent UI (602) scaling model comprise but are not limited to, opening screens, loading screens, actions, etc.
  • Scaling model 604 is a continuous scaling model where one or more components of a UI adapt to available display size. For instance, components of a UI may appear differently depending on whether the display is a small screen display or a larger screen display. Some examples of components of a UI that may utilize a continuous scaling (604) model comprise but are not limited to, context menus, message bars, notification surfaces, etc.
  • Scaling model 606 is a pivoting UI scaling model where one or more components of a UI change radically once a display size threshold is reached. For instance, components of a UI may be programmed differently depending on whether the display is a small screen display or a larger screen display. Some examples of components of a UI that may utilize a pivoting UI (606) scaling model comprise but are not limited to, file menus, history, application command control, etc. In an example, scaling models can be combined, for instance where a displayed UI may combine multiple scaling models such as scaling model 604 and scaling model 606. As an example, a transition (e.g., a detected change in display size/display class) between a UI instance of a processing device having a large display and a UI instance of a processing device having a smaller display may be handled in accordance with scaling model 606 (pivoting UI scaling model). While such a change may trigger scaling model 606 to be implemented, at the same time UI instances may also utilize a continuous scaling model (scaling model 604) to display UI elements appropriately to fit the display size of a processing device.
  • In examples, two or more UI scaling models 602-606 are applied to a display class to enable programmers to adaptively develop a scaled UI model that exhibits a range of behaviors for each display class. In this way, UI scaling models can be intelligently adapted to function best given the constraints (e.g., display size limitations) presented by some display classes.
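A per-component combination of the three scaling models may be sketched as follows. The component names, model assignments, and the 6-inch pivot threshold are assumptions for illustration only:

```typescript
// Each UI component is assigned one of the three scaling models
// (602 consistent, 604 continuous, 606 pivoting); the assignments and
// pivot threshold below are illustrative assumptions.
type ScalingModel = "consistent" | "continuous" | "pivoting";

const componentModels: Record<string, ScalingModel> = {
  loadingScreen: "consistent", // looks the same in every display class
  contextMenu: "continuous",   // stretches with available display space
  fileMenu: "pivoting",        // switches layout at an inflection point
};

function layoutFor(component: string, widthInches: number): string {
  switch (componentModels[component]) {
    case "consistent":
      return "fixed";
    case "continuous":
      return `scaled:${widthInches.toFixed(1)}in`;
    case "pivoting":
      return widthInches < 6 ? "compact" : "full";
    default:
      return "fixed";
  }
}
```

This mirrors the hybrid approach described above: a single UI may pivot its file menu at the threshold while continuously scaling its context menus to the same display.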
  • FIG. 7 is a diagram illustrating displays for exemplary processing devices of different sizes with which aspects of the present disclosure may be practiced. Examples shown in FIG. 7 comprise processing devices having varying sizes and/or varying screen/display sizes, for example processing device 702, processing device 704, processing device 706, and processing device 708.
  • As shown in FIG. 7, an application command control and an application/canvas are displayed in exemplary processing devices 702-708. An application command control and an application/canvas are examples of components of a UI with which the present disclosure may apply. In examples, the UI is programmed to efficiently scale itself to utilize display space of processing devices of different sizes and/or operating size of display windows. For example, presentation of the application command control and/or application/canvas may vary across the different processing devices 702-708. An application command control and/or an application/canvas may be scaled according to a determined display class associated with a processing device.
  • An application/canvas is a portion of a display of a processing device that is designated for display of an application executing on the device. The application/canvas region is the application UI that shows effects implemented by actions executed via an application command control. That is, the application/canvas is the content comprising, but not limited to, the pages in a workspace or editable portions of an application.
  • An application command control hosts a majority of an application's command set, organized in a hierarchical structure of individual palettes, chunks, and commands. Further, an application command control may be programmed to dynamically interact with an application and to display simultaneously with applications and/or user interface components such as a soft input panel (SIP) or on-screen keyboard. In one example, an application command control may intelligently adapt based on content of an application (e.g., displayed or selected on an application canvas). An application command control comprises a plurality of palettes (command palettes) programmed for application control. A palette is a collection or associated grouping of actions or commands, or chunks of commands, that can be implemented by an application command control. In one example, palettes of an application command control comprise top-level palettes and drill-in palettes. Each of the top-level palettes and the drill-in palettes is a collection or grouping of rows comprising one or more selectable commands or command elements. As an example, a top-level palette may comprise a highest-level grouping of commands or functionalities and may include commands that are more frequently used or more likely to be used by users. A top-level palette may display command listings that can be drilled into and displayed in drill-in palettes. FIG. 8 illustrates an exemplary top-level palette of an application command control. A drill-in palette is a collection or grouping of commands that may be used less frequently, or are likely to be used less frequently, compared to the commands displayed on a top-level palette. As an example, drill-in palettes host overflow commands that, due to constraints resulting from a limited amount of display space for an application command control, are not included in a top-level palette. FIG. 9 illustrates an exemplary drill-in palette of an application command control.
Using a word processing application as an exemplary application, a top-level palette may comprise high-level commands or functionality for text editing, font editing, paragraph formatting, word finder, spell-check etc. that may be frequently called on by users. As an example, a drill-in palette for a word processing application may comprise sub-elements of such high-level commands of the top-level palette, for example, subscript or superscript commands for a font command/function. In examples, organization of palettes and commands may be editable, for example, where a command or given chunk of a palette can be pulled from one palette and added/displayed in another. For instance, an overflow command of a drill-in palette can be added to a top-level palette.
  • Organization or grouping of commands in palettes may also be based on command grouping data available to programmers of an application command control. Command grouping data is information relating to the grouping of commands, including associations between commands. For example, text editing features such as bolding, underlining, italicization, superscript, and subscript may be associated and commonly used together. Ideally, the application command control would include all of these commonly used functions on the same palette. However, due to limitations on the screen size, certain commands may need to be separated. Command grouping data is information that identifies associations and which commands should or should not be separated from each other. For example, an application command control may determine that the maximum number of rows and commands allows displaying text formatting commands including a superscript editing command in a top-level palette but would not also allow displaying a subscript command. Using the command grouping data, it may be identified that, from a functionality and/or usability standpoint, it is best not to separate the superscript and subscript editing commands. For instance, a user who makes a subscript text edit may later look to make a superscript edit or vice versa. Thus, in setting the layout of commands for palettes, programmers of the application command control may display a higher-level command for text editing in a top-level palette, and the superscript and subscript editing commands may be included in a drill-in palette (child palette) of that top-level palette (parent palette) so they are not separated from each other.
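The grouping constraint described above may be sketched as follows: when an associated group of commands does not fit in the remaining top-level rows, the whole group is deferred to a drill-in palette rather than split across palettes. The group names and row limit are illustrative assumptions:

```typescript
// A chunk of associated commands (per command grouping data) that
// should not be separated across palettes.
interface CommandGroup {
  name: string;
  commands: string[];
}

// Assign whole groups to the top-level palette until rows run out;
// a group that does not fit moves intact to the drill-in palette.
function layoutPalettes(groups: CommandGroup[], maxTopLevelRows: number) {
  const topLevel: string[] = [];
  const drillIn: string[] = [];
  for (const group of groups) {
    if (topLevel.length + group.commands.length <= maxTopLevelRows) {
      topLevel.push(...group.commands);
    } else {
      // e.g., superscript and subscript stay together in a drill-in palette
      drillIn.push(...group.commands);
    }
  }
  return { topLevel, drillIn };
}
```

With a three-row top-level palette, a two-command text-formatting group fits, while a subsequent superscript/subscript group moves together into the drill-in palette rather than being split.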
  • Examples of common components that make up a top-level palette include but are not limited to: a palette bar and palette title, palette switching feature (including one touch target that launches palette switcher from title of palette bar), command to dismiss palette (e.g., visual representation of ellipses), quick commands (e.g., undo or redo), palette canvas comprising a plurality of commands, chunk commands (e.g., groupings of commands) and chunk dividers (e.g., dividing different groupings of commands), drill-in features to access drill-in palettes (when applicable).
  • Examples of common components that make up a drill-in palette can include but are not limited to: a palette bar and palette title, command to navigate back to the parent palette, command to dismiss palette (e.g., visual representation of ellipses), quick commands (e.g., undo or redo), palette canvas comprising a plurality of commands, chunk commands (e.g., groupings of commands) and chunk dividers (e.g., dividing different groupings of commands).
  • In one example, palettes of an application command control are presented in a vertical layout. For example, a top-level palette and a drill-in palette are vertically scrollable and comprise a collection of rows comprising one or more selectable command elements. However, in other examples, setting the layout of a palette may also comprise presenting commands in a horizontal layout where commands are horizontally scrollable. In some examples, no limit is set on the scrollable height of a palette. Scrolling position may be kept on top-level palettes when switching between top-level palettes; however, scrolling position may or may not be kept for drill-in palettes. Commands that are set and displayed may include labels identifying a command and may be configured to take up an entire row of a palette. In other examples, multiple commands may be displayed in one row of a palette. Scaling is applied in setting and displaying commands in palette rows. In some other examples, commands may not have labels, for example, commands that are well known or that have images displayed that are well known to users. Separators or spacers (either horizontal or vertical depending on the layout of a palette) may be displayed to break up different commands or chunks of commands.
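The scroll-position behavior described above may be sketched as follows. Treating drill-in palettes as always resetting to the top is an assumed choice, since the text permits either behavior ("may or may not be kept"):

```typescript
// Retain scroll offsets for top-level palettes across palette switches;
// drill-in palettes reset to the top (an assumption, as the disclosure
// notes drill-in position may or may not be kept).
class PaletteScrollState {
  private offsets = new Map<string, number>();

  save(paletteId: string, isTopLevel: boolean, offset: number): void {
    if (isTopLevel) this.offsets.set(paletteId, offset);
  }

  restore(paletteId: string): number {
    return this.offsets.get(paletteId) ?? 0;
  }
}
```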
  • In FIG. 8, application command control 802 is an exemplary top-level palette. In FIG. 9, application command control 902 is an exemplary drill-in palette. For example, application command control 902 displays a drill-in palette of the top-level palette 802 shown in FIG. 8, where top-level palette 802 is a parent palette of the drill-in palette 902 (e.g., child palette of the top-level palette). As shown in application command control 802, a row showing a “font formatting” command includes a caret indicative of a drill-in feature. When the drill-in feature is selected, a drill-in palette of application command control 902 is displayed on a display of a processing device. As can be seen in application command control 902, font formatting command features “superscript” and “subscript” are displayed. In this way, application command control and/or an application/canvas may be scaled in accordance with a determined display class associated with a processing device.
  • Reference has been made throughout this specification to “one example” or “an example,” meaning that a particular described feature, structure, or characteristic is included in at least one example. Thus, usage of such phrases may refer to more than just one example. Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more examples.
  • One skilled in the relevant art may recognize, however, that the examples may be practiced without one or more of the specific details, or with other methods, resources, materials, etc. In other instances, well known structures, resources, or operations have not been shown or described in detail merely to avoid obscuring aspects of the examples.
  • While sample examples and applications have been illustrated and described, it is to be understood that the examples are not limited to the precise configuration and resources described above. Various modifications, changes, and variations apparent to those skilled in the art may be made in the arrangement, operation, and details of the methods and systems disclosed herein without departing from the scope of the claimed examples.

Claims (20)

What is claimed is:
1. A computer-implemented method comprising:
detecting a display size associated with a display of a first processing device;
determining a display class based on the detected display size; and
launching, on the first processing device, a user interface for an application based on the determined display class.
2. The computer-implemented method according to claim 1, wherein the display class is a range corresponding to display sizes of processing devices, and wherein the determining determines the display class from a plurality of display classes based on the display size of the first processing device being within the range.
3. The computer-implemented method according to claim 1, further comprising detecting a change to the display size of the first processing device or a connection of a second processing device.
4. The computer-implemented method according to claim 3, further comprising initiating a display class change event upon detecting the changed display size of the first processing device, determining a display class associated with the changed display size of the first processing device, and when the display class has changed, adapting the user interface to display a scaled model associated with the changed display size.
5. The computer-implemented method according to claim 3, further comprising initiating a display class change event upon detecting the connection of the second processing device.
6. The computer-implemented method according to claim 5, wherein the initiating of the display class change event further comprises determining a display class of the second processing device based on a detected display size of the second processing device.
7. The computer-implemented method according to claim 6, further comprising adapting the user interface to display, on the second processing device, a scaled model designed for the determined display class of the second processing device when the determined display class of the second processing device is different from the display class associated with the first processing device.
8. The computer-implemented method according to claim 6, further comprising displaying the user interface on the first processing device at a first scaled model designed for the determined display class of the first processing device, and displaying the user interface on the second processing device at a second scaled model designed for the determined display class of the second processing device.
9. A system comprising:
a memory; and
at least one processor operatively connected with the memory, executing operations comprising:
detecting a display size associated with a display of a first processing device;
determining a display class based on the detected display size; and
launching, on the first processing device, a user interface for an application based on the determined display class.
10. The system according to claim 9, wherein the display class is a range corresponding to display sizes of processing devices, and wherein the determining determines the display class from a plurality of display classes based on the display size of the first processing device falling within the range.
11. The system according to claim 9, wherein the executed operations further comprise detecting a change to the display size of the first processing device or a connection of a second processing device.
12. The system according to claim 11, wherein the executed operations further comprise initiating a display class change event upon detecting the changed display size of the first processing device, determining a display class associated with the changed display size of the first processing device, and when the display class has changed, adapting the user interface to display a scaled model associated with the changed display size.
13. The system according to claim 11, wherein the executed operations further comprise initiating a display class change event upon detecting that the second processing device is connected.
14. The system according to claim 13, wherein the executed operations further comprise determining a display class of the second processing device based on a detected display size of the second processing device.
15. The system according to claim 13, wherein the executed operations further comprise adapting the user interface to display, on the second processing device, a scaled model designed for the determined display class of the second processing device when the determined display class of the second processing device is different from the display class associated with the first processing device.
16. The system according to claim 14, wherein the executed operations further comprise displaying the user interface on the first processing device at a first scaled model designed for the determined display class of the first processing device, and displaying the user interface on the second processing device at a second scaled model designed for the determined display class of the second processing device.
17. A computer-readable storage device including executable instructions that, when executed on at least one processor, cause the processor to perform a process comprising:
launching a user interface for an application at a first scaled model based on detecting a display class associated with a first processing device, wherein the display class of the first processing device is detected based on a determined display size of the first processing device;
detecting connection of a second processing device;
determining a display class associated with the second processing device, wherein the display class of the second processing device is detected based on a determined display size of the second processing device; and
adapting the user interface to display, on the second processing device, a second scaled model designed for the second processing device upon determining that the display class of the second processing device is different from the display class of the first processing device.
18. The computer-readable storage device according to claim 17, wherein the display class of the first processing device and the display class of the second processing device are determined from a plurality of display classes based on a display size associated with a processing device falling within a range value set for display sizes of processing devices.
19. The computer-readable storage device according to claim 17, wherein a display size is determined by evaluating an effective resolution of a display of a processing device.
20. The computer-readable storage device according to claim 17, wherein a display size is determined by evaluating at least one of a screen diagonal of a display of a processing device and a display width of a display of a processing device.
US14/726,868 2014-11-06 2015-06-01 User interface scaling for devices based on display size Abandoned US20160132992A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/726,868 US20160132992A1 (en) 2014-11-06 2015-06-01 User interface scaling for devices based on display size

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201462076368P 2014-11-06 2014-11-06
US14/726,868 US20160132992A1 (en) 2014-11-06 2015-06-01 User interface scaling for devices based on display size

Publications (1)

Publication Number Publication Date
US20160132992A1 true US20160132992A1 (en) 2016-05-12

Family

ID=55912229

Family Applications (4)

Application Number Title Priority Date Filing Date
US14/726,868 Abandoned US20160132992A1 (en) 2014-11-06 2015-06-01 User interface scaling for devices based on display size
US14/727,226 Abandoned US20160132301A1 (en) 2014-11-06 2015-06-01 Programmatic user interface generation based on display size
US14/840,360 Active 2037-07-24 US11126329B2 (en) 2014-11-06 2015-08-31 Application command control for smaller screen display
US14/880,768 Active 2038-01-03 US11422681B2 (en) 2014-11-06 2015-10-12 User interface for application command control

Family Applications After (3)

Application Number Title Priority Date Filing Date
US14/727,226 Abandoned US20160132301A1 (en) 2014-11-06 2015-06-01 Programmatic user interface generation based on display size
US14/840,360 Active 2037-07-24 US11126329B2 (en) 2014-11-06 2015-08-31 Application command control for smaller screen display
US14/880,768 Active 2038-01-03 US11422681B2 (en) 2014-11-06 2015-10-12 User interface for application command control

Country Status (2)

Country Link
US (4) US20160132992A1 (en)
WO (1) WO2017065988A1 (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180074999A1 (en) * 2016-09-14 2018-03-15 Pti Marketing Technologies Inc. Systems and methods for automatically reformatting publications
US20180335906A1 (en) * 2017-05-19 2018-11-22 Beijing Kingsoft Internet Security Software Co., Ltd. Application icon previewing method and device, and electronic device
CN109308205A (en) * 2018-08-09 2019-02-05 腾讯科技(深圳)有限公司 Display adaptation method, device, equipment and the storage medium of application program
USD844637S1 (en) * 2018-01-17 2019-04-02 Apple Inc. Electronic device with animated graphical user interface
USD847202S1 (en) * 2013-06-28 2019-04-30 Michael Flynn Portion of a communications terminal display screen with a dynamic icon
US10725632B2 (en) 2013-03-15 2020-07-28 Microsoft Technology Licensing, Llc In-place contextual menu for handling actions for a listing of items
US10949075B2 (en) 2014-11-06 2021-03-16 Microsoft Technology Licensing, Llc Application command control for small screen display
US11126329B2 (en) 2014-11-06 2021-09-21 Microsoft Technology Licensing, Llc Application command control for smaller screen display
USD936671S1 (en) * 2017-10-23 2021-11-23 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
US11340959B2 (en) * 2019-10-29 2022-05-24 Lg Electronics Inc. Electronic apparatus for running application and control method thereof
US20240028351A1 (en) * 2021-04-05 2024-01-25 Microsoft Technology Licensing, Llc Management of user interface elements based on historical configuration data
US11928417B2 (en) * 2016-06-10 2024-03-12 Truecontext Inc. Flexible online form display

Families Citing this family (47)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USD780198S1 (en) * 2013-09-18 2017-02-28 Lenovo (Beijing) Co., Ltd. Display screen with graphical user interface
USD801995S1 (en) * 2015-03-06 2017-11-07 Samsung Electronics Co., Ltd Display screen or portion thereof with graphical user interface
US10303350B2 (en) * 2015-05-20 2019-05-28 Hubin Jiang Systems and methods for generating online documents
CN104954869B (en) * 2015-05-22 2018-05-29 合肥杰发科技有限公司 Multi-medium play method and device based on Android system
US10455056B2 (en) * 2015-08-21 2019-10-22 Adobe Inc. Cloud-based storage and interchange mechanism for design elements
US10496241B2 (en) 2015-08-21 2019-12-03 Adobe Inc. Cloud-based inter-application interchange of style information
USD795891S1 (en) * 2015-11-09 2017-08-29 Aetna Inc. Computer display screen for a server maintenance tool with graphical user interface
USD786890S1 (en) * 2015-11-09 2017-05-16 Aetna Inc. Computer display screen for a server maintenance tool with graphical user interface
USD772250S1 (en) * 2015-11-09 2016-11-22 Aetna Inc. Computer display for a server maintenance tool graphical user interface
WO2017156496A1 (en) * 2016-03-11 2017-09-14 Post Oak Today LLC Methods and apparatus for establishing shared memory spaces for data access and distribution
USD808410S1 (en) * 2016-06-03 2018-01-23 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
USD809557S1 (en) 2016-06-03 2018-02-06 Samsung Electronics Co., Ltd. Display screen or portion thereof with transitional graphical user interface
US10496419B2 (en) * 2016-06-10 2019-12-03 Apple Inc. Editing inherited configurations
USD916762S1 (en) * 2016-07-14 2021-04-20 Nasdaq, Inc. Display screen or portion thereof with animated graphical user interface
USD816710S1 (en) * 2016-07-20 2018-05-01 Multilearning Group, Inc. Mobile device display screen with transitional graphical user interface
US10409487B2 (en) 2016-08-23 2019-09-10 Microsoft Technology Licensing, Llc Application processing based on gesture input
US11816459B2 (en) * 2016-11-16 2023-11-14 Native Ui, Inc. Graphical user interface programming system
USD817350S1 (en) * 2016-11-22 2018-05-08 Otis Elevator Company Display screen or portion thereof with graphical user interface
CN108475096A (en) * 2016-12-23 2018-08-31 北京金山安全软件有限公司 Information display method and device and terminal equipment
WO2018209106A2 (en) 2017-05-10 2018-11-15 Embee Mobile, Inc. System and method for the capture of mobile behavior, usage, or content exposure
KR102378953B1 (en) * 2017-06-16 2022-03-24 마이크로소프트 테크놀로지 라이센싱, 엘엘씨 Create a rule-based user interface
USD862509S1 (en) * 2017-08-23 2019-10-08 Amazon Technologies, Inc. Display screen or portion thereof having a graphical user interface
KR102029980B1 (en) * 2017-08-31 2019-10-08 한국전자통신연구원 Apparatus and method of generating alternative text
US20190087389A1 (en) * 2017-09-18 2019-03-21 Elutions IP Holdings S.à.r.l. Systems and methods for configuring display layout
GB2566949B (en) * 2017-09-27 2020-09-09 Avecto Ltd Computer device and method for managing privilege delegation
USD877752S1 (en) 2018-03-16 2020-03-10 Magic Leap, Inc. Display panel or portion thereof with graphical user interface
USD853426S1 (en) * 2018-03-28 2019-07-09 Manitowoc Crane Companies, Llc Mobile communication device display screen or portion thereof with graphical user interface
USD852833S1 (en) * 2018-03-28 2019-07-02 Manitowoc Crane Companies, Llc Mobile communication device display screen or portion thereof with graphical user interface
USD852834S1 (en) * 2018-03-28 2019-07-02 Manitowoc Crane Companies, Llc Mobile communication device display screen or portion thereof with graphical user interface
USD852835S1 (en) * 2018-03-28 2019-07-02 Manitowoc Crane Companies, Llc Mobile communication device display screen or portion thereof with graphical user interface
USD853428S1 (en) * 2018-03-28 2019-07-09 Manitowoc Crane Companies, Llc Mobile communication device display screen or portion thereof with graphical user interface
USD853427S1 (en) * 2018-03-28 2019-07-09 Manitowoc Crane Companies, Llc Mobile communication device display screen or portion thereof with graphical user interface
USD845988S1 (en) * 2018-03-28 2019-04-16 Manitowoc Crane Companies, Llc Mobile communication device display screen or portion thereof with graphical user interface
USD845987S1 (en) * 2018-03-28 2019-04-16 Manitowoc Crane Companies, Llc Mobile communication device display screen or portion thereof with graphical user interface
CN108549522A (en) * 2018-03-30 2018-09-18 深圳市万普拉斯科技有限公司 It takes pictures setting method, device, mobile terminal and computer readable storage medium
US10936163B2 (en) 2018-07-17 2021-03-02 Methodical Mind, Llc. Graphical user interface system
US10949174B2 (en) * 2018-10-31 2021-03-16 Salesforce.Com, Inc. Automatic classification of user interface elements
US11017045B2 (en) * 2018-11-19 2021-05-25 Microsoft Technology Licensing, Llc Personalized user experience and search-based recommendations
USD920996S1 (en) * 2019-02-22 2021-06-01 Teva Branded Pharmaceutical Products R&D, Inc. Display screen with a graphical user interface
USD913301S1 (en) * 2019-02-22 2021-03-16 Teva Branded Pharmaceutical Products R&D, Inc. Display screen with a graphical user interface
AU2021211470A1 (en) * 2020-01-22 2022-09-15 Methodical Mind, Llc. Graphical user interface system
US11231834B2 (en) 2020-06-03 2022-01-25 Micron Technology, Inc. Vehicle having an intelligent user interface
CN116360725B (en) * 2020-07-21 2024-02-23 华为技术有限公司 Display interaction system, display method and device
KR20220012599A (en) * 2020-07-23 2022-02-04 삼성전자주식회사 Apparatus and method for providing content search using keypad in electronic device
US11972095B2 (en) 2021-03-23 2024-04-30 Microsoft Technology Licensing, Llc Voice assistant-enabled client application with user view context and multi-modal input support
US12050841B2 (en) * 2021-03-23 2024-07-30 Microsoft Technology Licensing, Llc Voice assistant-enabled client application with user view context
US11789696B2 (en) * 2021-03-23 2023-10-17 Microsoft Technology Licensing, Llc Voice assistant-enabled client application with user view context

Family Cites Families (178)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2693810B1 (en) 1991-06-03 1997-01-10 Apple Computer USER INTERFACE SYSTEMS WITH DIRECT ACCESS TO A SECONDARY DISPLAY AREA.
US5371844A (en) * 1992-03-20 1994-12-06 International Business Machines Corporation Palette manager in a graphical user interface computer system
US5420605A (en) 1993-02-26 1995-05-30 Binar Graphics, Inc. Method of resetting a computer video display mode
US5499334A (en) 1993-03-01 1996-03-12 Microsoft Corporation Method and system for displaying window configuration of inactive programs
US5666498A (en) * 1996-03-29 1997-09-09 International Business Machines Corporation Method, memory and apparatus for automatically resizing a window
US5920315A (en) * 1996-07-17 1999-07-06 International Business Machines Corporation Multi-pane window with recoiling workspaces
US5796401A (en) 1996-08-09 1998-08-18 Winer; Peter W. System for designing dynamic layouts adaptable to various display screen sizes and resolutions
US5760772A (en) * 1996-08-30 1998-06-02 Novell, Inc. Method for automatically resizing a child window
US6023714A (en) * 1997-04-24 2000-02-08 Microsoft Corporation Method and system for dynamically adapting the layout of a document to an output device
US5886694A (en) 1997-07-14 1999-03-23 Microsoft Corporation Method for automatically laying out controls in a dialog window
US6433801B1 (en) * 1997-09-26 2002-08-13 Ericsson Inc. Method and apparatus for using a touch screen display on a portable intelligent communications device
US6018346A (en) * 1998-01-12 2000-01-25 Xerox Corporation Freeform graphics system having meeting objects for supporting meeting objectives
US6300947B1 (en) * 1998-07-06 2001-10-09 International Business Machines Corporation Display screen and window size related web page adaptation system
US6335743B1 (en) 1998-08-11 2002-01-01 International Business Machines Corporation Method and system for providing a resize layout allowing flexible placement and sizing of controls
US6342907B1 (en) 1998-10-19 2002-01-29 International Business Machines Corporation Specification language for defining user interface panels that are platform-independent
US6392836B1 (en) 1999-01-15 2002-05-21 Seagate Removable Storage Solutions Llc Tape cartridge-loading mechanism
US6538665B2 (en) 1999-04-15 2003-03-25 Apple Computer, Inc. User interface for presenting media information
US7624356B1 (en) * 2000-06-21 2009-11-24 Microsoft Corporation Task-sensitive methods and systems for displaying command sets
GB0019459D0 (en) 2000-07-28 2000-09-27 Symbian Ltd Computing device with improved user interface for applications
US6734882B1 (en) 2000-09-29 2004-05-11 Apple Computer, Inc. Combined menu-list control element in a graphical user interface
US6640655B1 (en) 2000-10-03 2003-11-04 Varco I/P, Inc. Self tracking sensor suspension mechanism
US6978473B1 (en) * 2000-10-27 2005-12-20 Sony Corporation Pop-up option palette
US7028306B2 (en) 2000-12-04 2006-04-11 International Business Machines Corporation Systems and methods for implementing modular DOM (Document Object Model)-based multi-modal browsers
US7493568B2 (en) * 2001-01-26 2009-02-17 Microsoft Corporation System and method for browsing properties of an electronic document
US6791581B2 (en) 2001-01-31 2004-09-14 Microsoft Corporation Methods and systems for synchronizing skin properties
US7155681B2 (en) 2001-02-14 2006-12-26 Sproqit Technologies, Inc. Platform-independent distributed user interface server architecture
GB0105994D0 (en) * 2001-03-10 2001-05-02 Pace Micro Tech Plc Video display resizing
WO2003005186A1 (en) 2001-07-05 2003-01-16 Fujitsu Limited Start up of application on information processor by means of portable unit
US6950993B2 (en) 2001-08-02 2005-09-27 Microsoft Corporation System and method for automatic and dynamic layout of resizable dialog type windows
US8966379B2 (en) 2010-10-01 2015-02-24 Z124 Dynamic cross-environment application configuration/orientation in an active user environment
US20130024778A1 (en) 2011-07-13 2013-01-24 Z124 Dynamic cross-environment application configuration/orientation
US7895522B2 (en) 2001-09-28 2011-02-22 Ntt Docomo, Inc. Layout of platform specific graphical user interface widgets migrated between heterogeneous device platforms
US7392483B2 (en) 2001-09-28 2008-06-24 Ntt Docomo, Inc. Transformation of platform specific graphical user interface widgets migrated between heterogeneous device platforms
US20030063120A1 (en) 2001-09-28 2003-04-03 Wong Hoi Lee Candy Scalable graphical user interface architecture
US20050066037A1 (en) 2002-04-10 2005-03-24 Yu Song Browser session mobility system for multi-platform applications
US20080313282A1 (en) 2002-09-10 2008-12-18 Warila Bruce W User interface, operating system and architecture
US20040056894A1 (en) 2002-09-19 2004-03-25 Igor Zaika System and method for describing and instantiating extensible user interfaces
US7574669B1 (en) * 2002-10-08 2009-08-11 Microsoft Corporation User interface control for navigating, selecting, and organizing document pages
US20040075693A1 (en) 2002-10-21 2004-04-22 Moyer Timothy A. Compact method of navigating hierarchical menus on an electronic device having a small display screen
US20040153973A1 (en) * 2002-11-21 2004-08-05 Lawrence Horwitz System and method for automatically storing and recalling application states based on application contexts
US8418081B2 (en) * 2002-12-18 2013-04-09 International Business Machines Corporation Optimizing display space with expandable and collapsible user interface controls
US20040223004A1 (en) 2003-05-05 2004-11-11 Lincke Scott D. System and method for implementing a landscape user experience in a hand-held computing device
US7308288B2 (en) 2003-08-22 2007-12-11 Sbc Knowledge Ventures, Lp. System and method for prioritized interface design
US7395500B2 (en) * 2003-08-29 2008-07-01 Yahoo! Inc. Space-optimizing content display
US20050055645A1 (en) * 2003-09-09 2005-03-10 Mitutoyo Corporation System and method for resizing tiles on a computer display
KR101068509B1 (en) 2003-09-24 2011-09-28 노키아 코포레이션 Improved presentation of large objects on small displays
US7418670B2 (en) 2003-10-03 2008-08-26 Microsoft Corporation Hierarchical in-place menus
US8302020B2 (en) * 2004-06-25 2012-10-30 Apple Inc. Widget authoring and editing environment
US8255828B2 (en) 2004-08-16 2012-08-28 Microsoft Corporation Command user interface for displaying selectable software functionality controls
US7895531B2 (en) 2004-08-16 2011-02-22 Microsoft Corporation Floating command object
US8169410B2 (en) 2004-10-20 2012-05-01 Nintendo Co., Ltd. Gesture inputs for a portable display device
US7812786B2 (en) 2005-01-18 2010-10-12 Nokia Corporation User interface for different displays
US7752633B1 (en) 2005-03-14 2010-07-06 Seven Networks, Inc. Cross-platform event engine
US7512904B2 (en) 2005-03-22 2009-03-31 Microsoft Corporation Operating system launch menu program listing
US9043719B2 (en) * 2005-04-08 2015-05-26 New York Stock Exchange Llc System and method for managing and displaying securities market information
US20060236264A1 (en) 2005-04-18 2006-10-19 Microsoft Corporation Automatic window resize behavior and optimizations
US7432928B2 (en) * 2005-06-14 2008-10-07 Microsoft Corporation User interface state reconfiguration through animation
US8392836B1 (en) 2005-07-11 2013-03-05 Google Inc. Presenting quick list of contacts to communication application user
US8689137B2 (en) 2005-09-07 2014-04-01 Microsoft Corporation Command user interface for displaying selectable functionality controls in a database application
US7673233B2 (en) * 2005-09-08 2010-03-02 Microsoft Corporation Browser tab management
CA2621488A1 (en) * 2005-09-13 2007-03-22 Spacetime3D, Inc. System and method for providing three-dimensional graphical user interface
US8904286B2 (en) 2006-02-13 2014-12-02 Blackberry Limited Method and arrangement for providing a primary actions menu on a wireless handheld communication device
US8635553B2 (en) * 2006-02-16 2014-01-21 Adobe Systems Incorporated Auto adjustable pane view
US20090278806A1 (en) 2008-05-06 2009-11-12 Matias Gonzalo Duarte Extended touch-sensitive control area for electronic device
US20070266335A1 (en) 2006-05-12 2007-11-15 Microsoft Corporation Providing a standard user interface (UI) across disparate display interfaces
KR100825871B1 (en) 2006-06-28 2008-04-28 삼성전자주식회사 Method and Apparatus for providing User Interface in a Terminal having Touch Pad
US8564544B2 (en) 2006-09-06 2013-10-22 Apple Inc. Touch screen device, method, and graphical user interface for customizing display of content category icons
US7934156B2 (en) 2006-09-06 2011-04-26 Apple Inc. Deletion gestures on a portable multifunction device
US20080163112A1 (en) 2006-12-29 2008-07-03 Research In Motion Limited Designation of menu actions for applications on a handheld electronic device
US8108763B2 (en) * 2007-01-19 2012-01-31 Constant Contact, Inc. Visual editor for electronic mail
WO2008115553A1 (en) 2007-03-20 2008-09-25 The University Of North Carolina At Chapel Hill Methods, systems, and computer readable media for automatically generating customizable user interfaces using programming patterns
AR067297A1 (en) 2007-03-28 2009-10-07 Avery Dennison Corp TAPE TYPE USER INTERFACE FOR AN APPLICATION PROGRAM
US8276069B2 (en) 2007-03-28 2012-09-25 Honeywell International Inc. Method and system for automatically generating an adaptive user interface for a physical environment
US8059101B2 (en) 2007-06-22 2011-11-15 Apple Inc. Swipe gestures for touch screen keyboards
US8478245B2 (en) 2007-08-01 2013-07-02 Phunware, Inc. Method and system for rendering content on a wireless device
US7949954B1 (en) 2007-08-17 2011-05-24 Trading Technologies International, Inc. Dynamic functionality based on window characteristics
US7917859B1 (en) * 2007-09-21 2011-03-29 Adobe Systems Incorporated Dynamic user interface elements
US20090192849A1 (en) 2007-11-09 2009-07-30 Hughes John M System and method for software development
US8078979B2 (en) 2007-11-27 2011-12-13 Microsoft Corporation Web page editor with element selection mechanism
US20090140977A1 (en) 2007-11-30 2009-06-04 Microsoft Corporation Common User Interface Structure
JP4364273B2 (en) 2007-12-28 2009-11-11 パナソニック株式会社 Portable terminal device, display control method, and display control program
WO2009126591A1 (en) * 2008-04-07 2009-10-15 Express Mobile, Inc. Systems and methods for programming mobile devices
US8085265B2 (en) 2008-04-23 2011-12-27 Honeywell International Inc. Methods and systems of generating 3D user interface for physical environment
KR101461954B1 (en) 2008-05-08 2014-11-14 엘지전자 주식회사 Terminal and method for controlling the same
US7930343B2 (en) 2008-05-16 2011-04-19 Honeywell International Inc. Scalable user interface system
TW201001267A (en) 2008-06-20 2010-01-01 Amtran Technology Co Ltd Electronic apparatus with screen displayed menu and its generation method
US20100122215A1 (en) 2008-11-11 2010-05-13 Qwebl, Inc. Control interface for home automation system
US8302026B2 (en) 2008-11-28 2012-10-30 Microsoft Corporation Multi-panel user interface
US8638311B2 (en) 2008-12-08 2014-01-28 Samsung Electronics Co., Ltd. Display device and data displaying method thereof
US8274536B2 (en) 2009-03-16 2012-09-25 Apple Inc. Smart keyboard management for a multifunction device with a touch screen display
EP2237140B1 (en) 2009-03-31 2018-12-26 Lg Electronics Inc. Mobile terminal and controlling method thereof
US9335916B2 (en) 2009-04-15 2016-05-10 International Business Machines Corporation Presenting and zooming a set of objects within a window
US9298336B2 (en) 2009-05-28 2016-03-29 Apple Inc. Rotation smoothing of a user interface
US8806331B2 (en) * 2009-07-20 2014-08-12 Interactive Memories, Inc. System and methods for creating and editing photo-based projects on a digital network
CN101996018A (en) 2009-08-17 2011-03-30 张学志 Novel vertical ribbon graphic user interface
US9465786B2 (en) 2009-08-25 2016-10-11 Keeper Security, Inc. Method for facilitating quick logins from a mobile device
US9116615B2 (en) 2009-10-13 2015-08-25 Blackberry Limited User interface for a touchscreen display
US8490018B2 (en) 2009-11-17 2013-07-16 International Business Machines Corporation Prioritization of choices based on context and user history
US8627230B2 (en) 2009-11-24 2014-01-07 International Business Machines Corporation Intelligent command prediction
US20120266069A1 (en) 2009-12-28 2012-10-18 Hillcrest Laboratories, Inc. TV Internet Browser
US9052894B2 (en) 2010-01-15 2015-06-09 Apple Inc. API to replace a keyboard with custom controls
WO2011108797A1 (en) 2010-03-03 2011-09-09 Lg Electronics Inc. Mobile terminal and control method thereof
US8799325B2 (en) 2010-03-12 2014-08-05 Microsoft Corporation Reordering nodes in a hierarchical structure
GB2479756B (en) 2010-04-21 2013-06-05 Realvnc Ltd Virtual interface devices
US8631350B2 (en) 2010-04-23 2014-01-14 Blackberry Limited Graphical context short menu
US9648279B2 (en) 2010-06-08 2017-05-09 Mitel Networks Corporation Method and system for video communication
US20110307804A1 (en) 2010-06-11 2011-12-15 Spierer Mitchell D Electronic message management system and method
CH703401B1 (en) * 2010-07-02 2019-04-30 Ferag Ag Method and device for generating a user interface for operating machines.
US20120017172A1 (en) * 2010-07-15 2012-01-19 Microsoft Corporation Display-agnostic user interface for mobile devices
US9483175B2 (en) * 2010-07-26 2016-11-01 Apple Inc. Device, method, and graphical user interface for navigating through a hierarchy
US9465457B2 (en) 2010-08-30 2016-10-11 Vmware, Inc. Multi-touch interface gestures for keyboard and/or mouse inputs
US8754860B2 (en) 2010-11-05 2014-06-17 Apple Inc. Device, method, and graphical user interface for manipulating soft keyboards
EP2697722A4 (en) 2011-04-13 2014-12-03 Blackberry Ltd System and method for context aware dynamic ribbon
US20130019150A1 (en) * 2011-07-13 2013-01-17 Rony Zarom System and method for automatic and dynamic layout design for media broadcast
US9582187B2 (en) 2011-07-14 2017-02-28 Microsoft Technology Licensing, Llc Dynamic context based menus
US9086794B2 (en) 2011-07-14 2015-07-21 Microsoft Technology Licensing, Llc Determining gestures on context based menus
US8707289B2 (en) * 2011-07-20 2014-04-22 Google Inc. Multiple application versions
US20130036443A1 (en) 2011-08-03 2013-02-07 Verizon Patent And Licensing Inc. Interactive and program half-screen
KR101862123B1 (en) 2011-08-31 2018-05-30 삼성전자 주식회사 Input device and method on terminal equipment having a touch module
US8909298B2 (en) 2011-09-30 2014-12-09 Samsung Electronics Co., Ltd. Apparatus and method for mobile screen navigation
US9760236B2 (en) * 2011-10-14 2017-09-12 Georgia Tech Research Corporation View virtualization and transformations for mobile applications
US9360998B2 (en) 2011-11-01 2016-06-07 Paypal, Inc. Selection and organization based on selection of X-Y position
WO2013067618A1 (en) 2011-11-09 2013-05-16 Research In Motion Limited Touch-sensitive display method and apparatus
US8881032B1 (en) * 2011-12-07 2014-11-04 Google Inc. Grouped tab document interface
KR20130064478A (en) 2011-12-08 2013-06-18 삼성전자주식회사 User terminal device and method for displaying background screen thereof
ES2691471T3 (en) * 2011-12-19 2018-11-27 Orange Method for notification of events on a device that executes identities of multiple users
US20130159917A1 (en) * 2011-12-20 2013-06-20 Lenovo (Singapore) Pte. Ltd. Dynamic user interface based on connected devices
JP2015508357A (en) 2012-01-09 2015-03-19 エアビクティ インコーポレイテッド User interface for mobile devices
US20130212487A1 (en) * 2012-01-09 2013-08-15 Visa International Service Association Dynamic Page Content and Layouts Apparatuses, Methods and Systems
US20130191781A1 (en) 2012-01-20 2013-07-25 Microsoft Corporation Displaying and interacting with touch contextual user interface
US9250768B2 (en) * 2012-02-13 2016-02-02 Samsung Electronics Co., Ltd. Tablet having user interface
EP2631762A1 (en) 2012-02-24 2013-08-28 Research In Motion Limited Method and apparatus for providing an option to enable multiple selections
US8539375B1 (en) 2012-02-24 2013-09-17 Blackberry Limited Method and apparatus for providing a user interface on a device enabling selection of operations to be performed in relation to content
US20130227413A1 (en) 2012-02-24 2013-08-29 Simon Martin THORSANDER Method and Apparatus for Providing a Contextual User Interface on a Device
EP2631761A1 (en) 2012-02-24 2013-08-28 Research In Motion Limited Method and apparatus for providing an option to undo a delete operation
US9081498B2 (en) 2012-02-24 2015-07-14 Blackberry Limited Method and apparatus for adjusting a user interface to reduce obscuration
EP2631747B1 (en) 2012-02-24 2016-03-30 BlackBerry Limited Method and apparatus for providing a user interface on a device that indicates content operators
DE102012005054A1 (en) 2012-03-15 2013-09-19 Volkswagen Aktiengesellschaft Method, mobile device and infotainment system for projecting a user interface on a screen
US10673691B2 (en) 2012-03-24 2020-06-02 Fred Khosropour User interaction platform
US9146655B2 (en) 2012-04-06 2015-09-29 Samsung Electronics Co., Ltd. Method and device for executing object on display
US9021371B2 (en) 2012-04-20 2015-04-28 Logitech Europe S.A. Customizing a user interface having a plurality of top-level icons based on a change in context
US8937636B2 (en) 2012-04-20 2015-01-20 Logitech Europe S.A. Using previous selection information in a user interface having a plurality of icons
US20130285926A1 (en) * 2012-04-30 2013-10-31 Research In Motion Limited Configurable Touchscreen Keyboard
US20140013271A1 (en) * 2012-07-05 2014-01-09 Research In Motion Limited Prioritization of multitasking applications in a mobile device interface
US20140189586A1 (en) 2012-12-28 2014-07-03 Spritz Technology Llc Methods and systems for displaying text using rsvp
US9256351B2 (en) 2012-07-20 2016-02-09 Blackberry Limited Method and electronic device for facilitating user control of a menu
US20140033110A1 (en) 2012-07-26 2014-01-30 Texas Instruments Incorporated Accessing Secondary Functions on Soft Keyboards Using Gestures
US20140055495A1 (en) * 2012-08-22 2014-02-27 Lg Cns Co., Ltd. Responsive user interface engine for display devices
US9329778B2 (en) 2012-09-07 2016-05-03 International Business Machines Corporation Supplementing a virtual input keyboard
US9729695B2 (en) 2012-11-20 2017-08-08 Dropbox Inc. Messaging client application interface
US9755995B2 (en) 2012-11-20 2017-09-05 Dropbox, Inc. System and method for applying gesture input to digital content
US9588674B2 (en) 2012-11-30 2017-03-07 Qualcomm Incorporated Methods and systems for providing an automated split-screen user interface on a device
US9652109B2 (en) 2013-01-11 2017-05-16 Microsoft Technology Licensing, Llc Predictive contextual toolbar for productivity applications
US9280523B2 (en) * 2013-01-23 2016-03-08 Go Daddy Operating Company, LLC System for conversion of website content
WO2014117241A1 (en) 2013-02-04 2014-08-07 602531 British Columbia Ltd. Data retrieval by way of context-sensitive icons
US10025459B2 (en) 2013-03-14 2018-07-17 Airwatch Llc Gesture-based workflow progression
US9792014B2 (en) 2013-03-15 2017-10-17 Microsoft Technology Licensing, Llc In-place contextual menu for handling actions for a listing of items
US20140282178A1 (en) 2013-03-15 2014-09-18 Microsoft Corporation Personalized community model for surfacing commands within productivity application user interfaces
WO2014152136A1 (en) 2013-03-15 2014-09-25 Beeonics, Inc. Method for single workflow for multi-platform mobile application creation and delivery
US20140282055A1 (en) * 2013-03-15 2014-09-18 Agilent Technologies, Inc. Layout System for Devices with Variable Display Screen Sizes and Orientations
US9304665B2 (en) 2013-04-05 2016-04-05 Yahoo! Inc. Method and apparatus for facilitating message selection and organization
US10249018B2 (en) 2013-04-25 2019-04-02 Nvidia Corporation Graphics processor and method of scaling user interface elements for smaller displays
US20140325345A1 (en) * 2013-04-26 2014-10-30 Amazon Technologies, Inc. Consistent Scaling of Web-Based Content Across Devices Having Different Screen Metrics
US9501500B2 (en) 2013-05-10 2016-11-22 Tencent Technology (Shenzhen) Company Limited Systems and methods for image file processing
TW201445403A (en) * 2013-05-17 2014-12-01 Global Lighting Technologies Multifunction input device
US20150033188A1 (en) 2013-07-23 2015-01-29 Microsoft Corporation Scrollable smart menu
US9311422B2 (en) 2013-09-12 2016-04-12 Adobe Systems Incorporated Dynamic simulation of a responsive web page
US9519401B2 (en) 2013-09-18 2016-12-13 Adobe Systems Incorporated Providing context menu based on predicted commands
US20150095767A1 (en) * 2013-10-02 2015-04-02 Rachel Ebner Automatic generation of mobile site layouts
US9507520B2 (en) 2013-12-16 2016-11-29 Microsoft Technology Licensing, Llc Touch-based reorganization of page element
JP5929883B2 (en) * 2013-12-18 2016-06-08 コニカミノルタ株式会社 Screen generation device, remote operation device, remote control device, screen generation method, and screen generation program
KR20150099324A (en) 2014-02-21 2015-08-31 삼성전자주식회사 Method for romote control between electronic devices and system therefor
US20150277726A1 (en) * 2014-04-01 2015-10-01 Microsoft Corporation Sliding surface
US9658741B2 (en) 2014-04-25 2017-05-23 Rohde & Schwarz Gmbh & Co. Kg Measuring device and measuring method with interactive operation
US10949075B2 (en) 2014-11-06 2021-03-16 Microsoft Technology Licensing, Llc Application command control for small screen display
US20160132992A1 (en) 2014-11-06 2016-05-12 Microsoft Technology Licensing, Llc User interface scaling for devices based on display size
US10241975B2 (en) * 2015-04-02 2019-03-26 Apple Inc. Dynamically determining arrangement of a layout
US20180004544A1 (en) * 2016-06-30 2018-01-04 Sap Se Personalized run time user interfaces

Patent Citations (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080273297A1 (en) * 1999-04-07 2008-11-06 Rajendra Kumar Portable computing, communication and entertainment device with central processor carried in a detachable portable device
US20030007006A1 (en) * 2001-06-12 2003-01-09 David Baar Graphical user interface with zoom for detail-in-context presentations
US20040163046A1 (en) * 2001-09-28 2004-08-19 Chu Hao-Hua Dynamic adaptation of GUI presentations to heterogeneous device platforms
US20060020899A1 (en) * 2004-04-26 2006-01-26 Microsoft Corporation Scaling icons for representing files
US20050246647A1 (en) * 2004-04-30 2005-11-03 Microsoft Corporation System and method for selecting a view mode using a control including a graphical depiction of the view mode
US20060082518A1 (en) * 2004-10-19 2006-04-20 Pranil Ram Multiple monitor display apparatus
US20080002115A1 (en) * 2006-06-30 2008-01-03 Motorola, Inc. Display stack-up for a mobile electronic device having internal and external displays
US20100060587A1 (en) * 2007-01-31 2010-03-11 Joseph Michael Freund Handheld Device with Multiple Displays
US20140032534A1 (en) * 2007-05-25 2014-01-30 Veveo, Inc. System and method for text disambiguation and context designation in incremental search
US20130222434A1 (en) * 2007-08-27 2013-08-29 Samsung Electronics Co., Ltd. Adaptive video processing apparatus and method of scaling video based on screen size of display device
US20090058885A1 (en) * 2007-08-27 2009-03-05 Samsung Electronics Co., Ltd. Adaptive video processing apparatus and method of scaling video based on screen size of display device
US20090303676A1 (en) * 2008-04-01 2009-12-10 Yves Behar System and method for streamlining user interaction with electronic content
US20100138780A1 (en) * 2008-05-20 2010-06-03 Adam Marano Methods and systems for using external display devices with a mobile computing device
US20140040781A1 (en) * 2008-10-13 2014-02-06 Lewis Epstein Egalitarian Control Apparatus and Method for Sharing Information in a Collaborative Workspace
US20160320938A9 (en) * 2009-03-17 2016-11-03 Litera Technologies, LLC System and Method for the Auto-Detection and Presentation of Pre-Set Configurations for Multiple Monitor Layout Display
US20110242750A1 (en) * 2010-04-01 2011-10-06 Oakley Nicholas W Accessible display in device with closed lid
US20120030584A1 (en) * 2010-07-30 2012-02-02 Brian Bian Method and apparatus for dynamically switching between scalable graphical user interfaces for mobile devices
US20140325054A1 (en) * 2010-11-09 2014-10-30 Vmware, Inc. Remote Display Performance Measurement Triggered By Application Display Upgrade
US20120240056A1 (en) * 2010-11-17 2012-09-20 Paul Webber Email client mode transitions in a smartpad device
US20140310643A1 (en) * 2010-12-10 2014-10-16 Yota Devices Ipr Ltd. Mobile device with user interface
US20120287114A1 (en) * 2011-05-11 2012-11-15 Microsoft Corporation Interface including views positioned in along multiple dimensions
US20140143708A1 (en) * 2011-07-06 2014-05-22 Tencent Technology (Shenzhen) Company Limited Desktop Switching Method And Device
US20160364219A9 (en) * 2012-03-26 2016-12-15 Greyheller, Llc Dynamically optimized content display
US20150088669A1 (en) * 2012-08-16 2015-03-26 SK Planet Co., Ltd Apparatus and method for providing responsive user interface and electronic device-readable recording medium therefor
US20150286359A1 (en) * 2012-12-28 2015-10-08 Nicholas W. Oakley Dual configuration computer
US9160915B1 (en) * 2013-01-09 2015-10-13 Amazon Technologies, Inc. Modifying device functionality based on device orientation
US20150061968A1 (en) * 2013-08-29 2015-03-05 Samsung Electronics Co., Ltd. User terminal apparatus, method for controlling user terminal apparatus thereof, and expanded display system
US20150143271A1 (en) * 2013-11-15 2015-05-21 Microsoft Corporation Remote control for displaying application data on dissimilar screens
US20150277682A1 (en) * 2014-04-01 2015-10-01 Microsoft Corporation Scalable user interface display
US20160209973A1 (en) * 2015-01-21 2016-07-21 Microsoft Technology Licensing, Llc. Application user interface reconfiguration based on an experience mode transition
US20160209994A1 (en) * 2015-01-21 2016-07-21 Microsoft Technology Licensing, Llc. Adaptable user interface display

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Khalilbeigi et al., "FoldMe: Interacting with Double-sided Foldable Displays," Proceedings of the 6th International Conference on Tangible and Embedded Interaction 2012, Kingston, Ontario, Canada, February 19-22, 2012, https://hci.cs.uni-saarland.de/files/2012/11/p33-kahlilbeigi.pdf *

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10725632B2 (en) 2013-03-15 2020-07-28 Microsoft Technology Licensing, Llc In-place contextual menu for handling actions for a listing of items
USD847202S1 (en) * 2013-06-28 2019-04-30 Michael Flynn Portion of a communications terminal display screen with a dynamic icon
US11422681B2 (en) 2014-11-06 2022-08-23 Microsoft Technology Licensing, Llc User interface for application command control
US10949075B2 (en) 2014-11-06 2021-03-16 Microsoft Technology Licensing, Llc Application command control for small screen display
US11126329B2 (en) 2014-11-06 2021-09-21 Microsoft Technology Licensing, Llc Application command control for smaller screen display
US11928417B2 (en) * 2016-06-10 2024-03-12 Truecontext Inc. Flexible online form display
US12001776B2 (en) * 2016-09-14 2024-06-04 Pti Marketing Technologies Inc. Systems and methods for automatically reformatting publications
US20180074999A1 (en) * 2016-09-14 2018-03-15 Pti Marketing Technologies Inc. Systems and methods for automatically reformatting publications
US20180335906A1 (en) * 2017-05-19 2018-11-22 Beijing Kingsoft Internet Security Software Co., Ltd. Application icon previewing method and device, and electronic device
USD936671S1 (en) * 2017-10-23 2021-11-23 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
USD844637S1 (en) * 2018-01-17 2019-04-02 Apple Inc. Electronic device with animated graphical user interface
US11231845B2 (en) 2018-08-09 2022-01-25 Tencent Technology (Shenzhen) Company Limited Display adaptation method and apparatus for application, and storage medium
CN109308205A (en) * 2018-08-09 2019-02-05 腾讯科技(深圳)有限公司 Display adaptation method, device, equipment and the storage medium of application program
US11340959B2 (en) * 2019-10-29 2022-05-24 Lg Electronics Inc. Electronic apparatus for running application and control method thereof
US20240028351A1 (en) * 2021-04-05 2024-01-25 Microsoft Technology Licensing, Llc Management of user interface elements based on historical configuration data

Also Published As

Publication number Publication date
US11422681B2 (en) 2022-08-23
US20160132234A1 (en) 2016-05-12
WO2017065988A1 (en) 2017-04-20
US20160132301A1 (en) 2016-05-12
US20160132195A1 (en) 2016-05-12
US11126329B2 (en) 2021-09-21

Similar Documents

Publication Publication Date Title
US20160132992A1 (en) User interface scaling for devices based on display size
US10949075B2 (en) Application command control for small screen display
CN106164856B (en) Adaptive user interaction pane manager
US10684769B2 (en) Inset dynamic content preview pane
US10042655B2 (en) Adaptable user interface display
US20190155456A1 (en) Adaptive user interface pane objects
US20180225263A1 (en) Inline insertion viewport
US20150277682A1 (en) Scalable user interface display
CN112154427A (en) Progressive display user interface for collaborative documents
US20130125041A1 (en) Format Object Task Pane
US9792038B2 (en) Feedback via an input device and scribble recognition
US20140354554A1 (en) Touch Optimized UI
US10209864B2 (en) UI differentiation between delete and clear
US12056413B2 (en) Contextual workflow triggering on devices

Legal Events

Date Code Title Description
AS Assignment

Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:RODRIG, MAYA;STEPANICH, DARRON;BOYD, PATRICK;AND OTHERS;SIGNING DATES FROM 20150526 TO 20150529;REEL/FRAME:035754/0165

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION