US20140176479A1 - Video Peeking - Google Patents
- Publication number
- US20140176479A1 US20140176479A1 US14/236,105 US201214236105A US2014176479A1 US 20140176479 A1 US20140176479 A1 US 20140176479A1 US 201214236105 A US201214236105 A US 201214236105A US 2014176479 A1 US2014176479 A1 US 2014176479A1
- Authority
- US
- United States
- Prior art keywords
- swiping
- video
- screen
- video image
- motion
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/34—Indicating arrangements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/431—Generation of visual interfaces for content selection or interaction; Content or additional data rendering
- H04N21/4312—Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/438—Interfacing the downstream path of the transmission network originating from a server, e.g. retrieving encoded video stream packets from an IP network
- H04N21/4383—Accessing a communication channel
Definitions
- the background of the invention relates to the general operation of a touch activatable screen in a video tablet or other display device.
- the description of a touch activatable screen in a video tablet or other display device found herein can also be found in PCT/US2010/049772.
- Patents, published applications and articles cited in the initial International Search Report for PCT/US2010/049772, and thus relating to the general operation of a touch activatable screen include: US2009210819A1; US20070013708A1; US20080079972A1; US20090153478A1; US2009019924A1; EP1450277A2; and SHIRAZI J: “Java Performance Tuning—Chapter 4 Object Creation”.
- the elements shown in the figures can be implemented in various forms of hardware, software or combinations thereof. Preferably, these elements are implemented in a combination of hardware and software on one or more appropriately programmed general-purpose devices, which can include a processor, memory and input/output interfaces.
- the phrase “coupled” is defined to mean directly connected to or indirectly connected with through one or more intermediate components. Such intermediate components can include both hardware and software based components.
- the terms “processor” or “controller” should not be construed to refer exclusively to hardware capable of executing software, and can implicitly include, without limitation, digital signal processor (DSP) hardware, read only memory (ROM) for storing software, random access memory (RAM), and nonvolatile storage.
- any switches shown in the figures are conceptual only. Their function can be carried out through the operation of program logic, through dedicated logic, through the interaction of program control and dedicated logic, or even manually, the particular technique being selectable by the implementer as more specifically understood from the context.
- any element expressed as a means for performing a specified function is intended to encompass any way of performing that function including, for example, a) a combination of circuit elements that performs that function or b) software in any form, including, therefore, firmware, microcode or the like, combined with appropriate circuitry for executing that software to perform the function.
- the disclosure as defined by such claims resides in the fact that the functionalities provided by the various recited means are combined and brought together in the manner which the claims call for. It is thus regarded that any means that can provide those functionalities are equivalent to those shown herein.
- the present disclosure provides for a grid display which is a graphical user interface view that enables a user to navigate a set of data elements in two dimensional space (i.e., x and y directions).
- the grid display can have a two dimensional (2D) pattern, e.g., columns and rows, but can take other forms. Navigation of the grid display can be accomplished using commands, such as gestures, to locate the desired element. An entry in the grid display is tapped or otherwise selected to initiate further action, e.g., executing or playing associated content.
- This interface mechanism is for use in media applications where items on the grid display can be represented graphically, such as by audio album covers or video poster images.
- the particular embodiments describe an apparatus and method associated with optimizations to the view of a grid display implementation such that the number of display elements is minimized and independent of the number of items in the full dataset.
- the embodiments also address issues with navigation through the database so that it can be smooth and efficient with respect to the visual perception of the displayed portion.
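The display-element optimization described above can be illustrated with a short sketch of a virtualized grid window: only the cells visible in the viewport (plus a small buffer) are materialized, so the number of rendered elements stays bounded regardless of the size of the full dataset. The class and method names here are illustrative assumptions, not taken from the patent.

```python
# Sketch of a virtualized grid view: the number of display elements is
# fixed by the viewport size, independent of the total item count.
# GridWindow and visible_indices are hypothetical names for illustration.

class GridWindow:
    def __init__(self, columns, visible_rows, buffer_rows=1):
        self.columns = columns            # cells per row
        self.visible_rows = visible_rows  # rows shown on screen
        self.buffer_rows = buffer_rows    # extra rows kept for smooth scrolling

    def visible_indices(self, first_row, total_items):
        """Return the dataset indices to render for the current scroll position."""
        start_row = max(0, first_row - self.buffer_rows)
        end_row = first_row + self.visible_rows + self.buffer_rows
        start = start_row * self.columns
        end = min(end_row * self.columns, total_items)
        return list(range(start, end))
```

With a 4-column, 3-row viewport and one buffer row, at most 20 elements are ever materialized, whether the library holds a hundred items or a million.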
- the apparatus and method can be particularly adapted for use in a content distribution network encompassing controlled access to a large database of media content.
- an input device such as a motion sensing remote controller
- a touch screen or panel remote device is employed having the cursor on a screen essentially tracking the user's finger or fingers as they move across the remote controller's screen.
- the graphic elements representing content in the database move in response to the user's input where certain graphic elements disappear and new graphic elements appear.
- the touch screen or panel remote device can serve as a display device itself or can simply serve as a navigational tool.
- a conventional hand-held remote controller is employed using at least one button or input mechanism disposed on a surface of the remote controller to navigate the grid display.
- Referring to FIG. 1 , a block diagram of an embodiment of a system 100 for delivering content to a home or end user is shown.
- the content originates from a content source 102 , such as a movie studio or production house.
- the content can be supplied in at least one of two forms.
- One form can be a broadcast form of content.
- the broadcast content is provided to the broadcast affiliate manager 104 , which is typically a national broadcast service, such as the American Broadcasting Company (ABC), National Broadcasting Company (NBC), Columbia Broadcasting System (CBS), etc.
- the broadcast affiliate manager can collect and store the content, and can schedule delivery of the content over a delivery network, shown as delivery network 1 ( 106 ).
- Delivery network 1 ( 106 ) can include satellite link transmission from a national center to one or more regional or local centers. Delivery network 1 ( 106 ) can also include local content delivery using local delivery systems such as over the air broadcast, satellite broadcast, or cable broadcast. The locally delivered content is provided to a receiving device 108 in a user's home, where the content will subsequently be searched by the user. It is to be appreciated that the receiving device 108 can take many forms and can be embodied as a set top box/digital video recorder (DVR), a gateway, a modem, etc. Further, the receiving device 108 can act as entry point, or gateway, for a home network system that includes additional devices configured as either client or peer devices in the home network.
- Special content can include content delivered as premium viewing, pay-per-view, or other content otherwise not provided to the broadcast affiliate manager, e.g., movies, video games or other video elements.
- the special content can be content requested by the user.
- the special content can be delivered to a content manager 110 .
- the content manager 110 can be a service provider, such as an Internet website, affiliated, for instance, with a content provider, broadcast service, or delivery network service.
- the content manager 110 can also incorporate Internet content into the delivery system.
- the content manager 110 can deliver the content to the user's receiving device 108 over a separate delivery network, delivery network 2 ( 112 ).
- Delivery network 2 ( 112 ) can include high-speed broadband Internet type communications systems.
- the content from the broadcast affiliate manager 104 can also be delivered using all or parts of delivery network 2 ( 112 ) and content from the content manager 110 can be delivered using all or parts of delivery network 1 ( 106 ).
- the user can also obtain content directly from the Internet via delivery network 2 ( 112 ) without necessarily having the content managed by the content manager 110 .
- the special content is provided as an augmentation to the broadcast content, providing alternative displays, purchase and merchandising options, enhancement material, etc.
- the special content can completely replace some programming content provided as broadcast content.
- the special content can be completely separate from the broadcast content, and can simply be a media alternative that the user can choose to utilize.
- the special content can be a library of movies that are not yet available as broadcast content.
- the receiving device 108 can receive different types of content from one or both of delivery network 1 and delivery network 2 .
- the receiving device 108 processes the content, and provides a separation of the content based on user preferences and commands.
- the receiving device 108 can also include a storage device, such as a hard drive or optical disk drive, for recording and playing back audio and video content. Further details of the operation of the receiving device 108 and features associated with playing back stored content will be described below in relation to FIG. 2 .
- the processed content is provided to a display device 114 .
- the display device 114 can be a conventional 2-D type display or can alternatively be an advanced 3-D display.
- A block diagram of an embodiment of a receiving device 200 is shown in FIG. 2 .
- Receiving device 200 can operate similarly to the receiving device 108 described in FIG. 1 and can be included as part of a gateway device, modem, set top box, or other similar communications device.
- the device 200 shown can also be incorporated into other systems including the display device ( 114 ) itself. In either case, several components necessary for complete operation of the system are not shown in the interest of conciseness, as they are well known to those skilled in the art.
- the content is received in an input signal receiver 202 .
- the input signal receiver 202 can be one of several known receiver circuits used for receiving, demodulation, and decoding signals provided over one of the several possible networks including over the air, cable, satellite, Ethernet, fiber and phone line networks.
- the desired input signal can be selected and retrieved in the input signal receiver 202 based on user input provided through a control interface (not shown).
- the decoded output signal is provided to an input stream processor 204 .
- the input stream processor 204 performs the final signal selection and processing, and includes separation of video content from audio content for the content stream.
- the audio content is provided to an audio processor 206 for conversion from the received format, such as compressed digital signal, to an analog waveform signal.
- the analog waveform signal is provided to an audio interface 208 and further to the display device 114 or an audio amplifier (not shown).
- the audio interface 208 can provide a digital signal to an audio output device or display device using a High-Definition Multimedia Interface (HDMI) cable or alternate audio interface such as via a Sony/Philips Digital Interconnect Format (SPDIF).
- the audio processor 206 also performs any necessary conversion for the storage of the audio signals.
- the video output from the input stream processor 204 is provided to a video processor 210 .
- the video signal can be one of several formats.
- the video processor 210 provides, as necessary, a conversion of the video content, based on the input signal format.
- the video processor 210 also performs any necessary conversion for the storage of the video signals.
- a storage device 212 stores audio and video content received at the input.
- the storage device 212 allows later retrieval and playback of the content under the control of a controller 214 and also based on commands, e.g., navigation instructions such as fast-forward (FF) and rewind (Rew), received from a user interface 216 .
- the storage device 212 can be a hard disk drive, one or more large capacity integrated electronic memories, such as static RAM (SRAM), or dynamic RAM (DRAM), or can be an interchangeable optical disk storage system such as a compact disk (CD) drive or digital video disk (DVD) drive.
- the converted video signal from the video processor 210 , either originating from the input or from the storage device 212 , is provided to the display interface 218 .
- the display interface 218 further provides the display signal to a display device of the type described above.
- the display interface 218 can be an analog signal interface such as red-green-blue (RGB) or can be a digital interface such as HDMI. It is to be appreciated that the display interface 218 will generate the various screens for presenting the search results in a three dimensional grid, as will be described in more detail below.
- the controller 214 is interconnected via a bus to several of the components of the device 200 , including the input stream processor 204 , audio processor 206 , video processor 210 , storage device 212 , and a user interface 216 .
- the controller 214 manages the conversion process for converting the input stream signal into a signal for storage on the storage device or for display.
- the controller 214 also manages the retrieval and playback of stored content.
- the controller 214 performs searching of content and the creation and adjusting of the grid display representing the content, either stored or to be delivered via the delivery networks, described above.
- the controller 214 is further coupled to control memory 220 (e.g., volatile or non-volatile memory, including RAM, SRAM, DRAM, ROM, programmable ROM (PROM), flash memory, electronically programmable ROM (EPROM), electronically erasable programmable ROM (EEPROM), etc.) for storing information and instruction code for controller 214 .
- the implementation of the memory can include several possible embodiments, such as a single memory device
- the user interface 216 of the present disclosure employs an input device that moves a cursor around the display.
- a touch panel device 300 can be interfaced to the receiving device 108 , as shown in FIG. 3( a ).
- the touch panel device 300 allows operation of the receiving device or set top box based on hand movements, or gestures, and actions translated through the panel into commands for the set top box.
- the touch panel 300 can simply serve as a navigational tool to navigate the grid display.
- the touch panel 300 will additionally serve as the display device allowing the user to more directly interact with the navigation through the grid display of content.
- a mouse device, a remote control with navigation features, or a gesture-based remote control can also be used, as will be described below.
- the user interface control can be included in the receiving device 200 as part of the user interface 216 or as part of controller 214 .
- the user interface control incorporates features useful for display and navigation through a grid representing content in a database, as well as for video display of content.
- the user interface, and more specifically the grid user interface element is incorporated into a video media player interface that includes scripting or programming capability for manipulation of graphics.
- the video media player and interface can be implemented in the receiving device 200 using any combination of hardware, software, or firmware.
- some portion of the control and video display operation can be included in the touch panel device 300 and also can be part of the information transmitted across the home network.
- the input device is a remote controller, with a form of motion detection, such as a gyroscope or accelerometer, which allows the user to move a cursor freely about a screen or display.
- An exemplary hand-held angle-sensing remote controller 301 is illustrated in FIG. 3( b ).
- Remote controller 301 includes a thumb button 302 , positioned on the top side of controller 301 so as to be selectively activated by a user's thumb. Activation of thumb button 302 will also be referred to as a “click,” a command often associated with activation or launch of a selected function.
- Controller 301 further includes a trigger button 304 , positioned on the bottom side of controller 301 so as to be selectively activated by a user's index (or “trigger”) finger.
- Activation of trigger button 304 will also be referred to as a “trigger,” and angular movement (i.e., pitch, yaw and/or roll) of the controller 301 while the trigger is depressed will be referred to as a “trigger-drag.”
- a trigger-drag command is often associated with movement of a cursor, virtual cursor or other indication of the user's interactive position on the display, such as a change of state (i.e., a highlighted or outlined cell), and is commonly used to navigate in and select entries from the interactive display.
- a plurality of buttons 306 are provided for entering numbers and/or letters. In one embodiment, the plurality of buttons 306 is configured similar to a telephone-type keypad.
- a hand-held angle-sensing remote controller such as controller 301 described in FIG. 3( b ) provides for a number of types of user interaction.
- in an angle-sensing controller, changes in yaw map to left-and-right motions, changes in pitch map to up-and-down motions, and changes in roll map to rotational motions along a longitudinal axis of the controller.
- These inputs are used to define gestures and the gestures, in turn, define specific contextual commands.
- a combination of yaw and pitch can be used to define any 2-dimensional motion, such as a diagonal motion.
- a combination of yaw, pitch and roll can be used to define any 3-dimensional motion, such as a swing.
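The yaw-to-horizontal and pitch-to-vertical mapping can be sketched as a small conversion function. The function name, gain value, and sign convention (screen y growing downward) are illustrative assumptions, not details from the patent.

```python
# Illustrative mapping from angular changes of the controller during a
# trigger-drag to screen-space cursor deltas. The gain is a hypothetical
# sensitivity factor; real devices would calibrate it.

def angles_to_cursor_delta(d_yaw_deg, d_pitch_deg, gain=10.0):
    """Convert yaw/pitch changes (degrees) into cursor deltas (pixels)."""
    dx = d_yaw_deg * gain      # yaw: left-and-right motion
    dy = -d_pitch_deg * gain   # pitch up moves the cursor up; screen y grows down
    return dx, dy
```

Adding a roll term in the same fashion would extend this to the 3-dimensional motions, such as a swing, mentioned above.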
- a number of gestures are illustrated in FIG. 3 . Gestures are interpreted in context and are identified by defined movements of the controller 301 while the trigger button 304 is held (“trigger-drag” movements).
- Bumping 320 is defined by a two-stroke drawing indicating pointing in one direction, either up, down, left or right.
- the bumping gesture is associated with specific commands in context. For example, in a TimeShifting mode, a left-bump gesture 320 indicates rewinding, and a right-bump gesture indicates fast-forwarding. In other contexts, a bump gesture 320 is interpreted to increment a particular value in the direction designated by the bump.
- Checking 330 is defined as in drawing a checkmark. It is similar to a downward bump gesture 320 . Checking is identified in context to designate a reminder, user tag or to select an item or element.
- Circling 340 is defined as drawing a circle in either direction. It is possible that both directions could be distinguished.
- Dragging 350 is defined as an angular movement of the controller (a change in pitch and/or yaw) while holding trigger button 304 (i.e., a “trigger drag”).
- the dragging gesture 350 is used for navigation, speed, distance, time-shifting, rewinding, and forwarding.
- Dragging 350 can be used to move a cursor, a virtual cursor, or a change of state, such as highlighting, outlining, or selecting on the display. Dragging 350 can be in any direction and is generally used to navigate in two dimensions. However, in certain interfaces, it is preferred to modify the response to the dragging command.
- Nodding 360 is defined by two fast trigger-drag up-and-down vertical movements. Nodding 360 is used to indicate “Yes” or “Accept.”
- X-ing 370 is defined as in drawing the letter “X.” X-ing 370 is used for “Delete” or “Block” commands.
- Wagging 380 is defined by two trigger-drag fast back-and-forth horizontal movements. The wagging gesture 380 is used to indicate “No” or “Cancel.”
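The context-dependent interpretation of gestures described above (e.g., a left-bump meaning rewind in a TimeShifting mode) can be sketched as a two-level dispatch table. The mode names, gesture labels, and command strings are illustrative assumptions, not identifiers from the patent.

```python
# Minimal sketch of contextual gesture dispatch: the same recognized
# gesture resolves to different commands depending on the interface mode.

GESTURE_COMMANDS = {
    "timeshift": {
        "bump_left": "rewind",       # left-bump rewinds in TimeShifting mode
        "bump_right": "fast_forward",
        "nod": "accept",
        "wag": "cancel",
    },
    "grid": {
        "bump_left": "move_left",    # in a grid, a bump increments position
        "bump_right": "move_right",
        "check": "select",
        "x": "delete",
    },
}

def dispatch(mode, gesture):
    """Resolve a recognized gesture to a command in the current context."""
    return GESTURE_COMMANDS.get(mode, {}).get(gesture, "ignored")
```

For example, `dispatch("timeshift", "bump_left")` yields the rewind command, while the same gesture in the grid context moves the selection instead.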
- the input device will also include a mechanism to invoke or execute at least three separate options on any element selected on the display or screen. These options will be referred to as “Additional Information”, “Play” and “Additional Search”.
- the “Additional Information” function is used to display more information about the currently selected element.
- the “Play” function, assuming it is available for the selected element, will select that element to be played, which can require a secondary user interface for content purchase, etc.
- the “Additional Search” function represents the mechanism that allows a user to use any element as a source for an additional advanced search that will generate a whole new content set, updating the entire screen based on criteria defined by the selected element. It is to be appreciated that these three options can be associated with predefined or new gestures, for example, on touch panel 300 , or each option can be assigned to a predetermined button, for example, of the plurality of buttons 306 on the remote controller 301 .
- FIG. 4 illustrates a graphical flowchart for the operation of a user interface related to the display and navigational aspects of the grid display of the present disclosure.
- video content 401 from either a broadcast source or the managed special source, can be displayed, in step 402 .
- entering the main menu of the interface can be accomplished by tapping, or otherwise, selecting on the video screen.
- the main menu screen can include a number of user informational elements 403 , and can also include a portion of the display still displaying the previous video content 401 .
- the video content can continue to run, or can be placed in pause mode.
- Navigation into the content library can include using the search, browse, or recommend button by tapping on or otherwise selecting the desired button.
- selection of the search or recommend button accesses a linear display structure 405 of certain objects within the library or database, and includes additional criteria for the search or recommendation feature to restrict the coverage size of the database, including actor, genre, or title criteria.
- the linear grid 405 can be more useful for these access functions due to the restrictions placed on the access to the library and the reduction in searchable content.
- selecting the browse function as a navigation tool pulls up a separate two dimensional grid display 407 of content selections.
- the browse function provides access to the complete library or database and promotes very little restriction in navigation around the database.
- the grid display 407 and navigation for the library or database will be described in further detail below.
- An entry or selection of an element from a content display (for instance by tapping on it), after it has been highlighted or enlarged in one of the previous function operations, opens a detail screen which provides further detail about the selected content entry, step 410.
- the detail screen also provides additional options for playing or executing, renting, recording, or buying the content as well as options to return to a previous content navigation function described above.
- FIG. 5 illustrates a detail view of an embodiment of the grid display 500 using aspects of the present disclosure.
- Grid display 500 operates in a manner similar to the operation of grid display 407 described in FIG. 4 .
- the grid display 500 can be displayed on the display device 114 and manipulated or navigated using the touch panel 300 or other navigation device described earlier.
- the interface screen can also be displayed on the touch panel device 300 as a remote display, allowing the user to more directly interact with the navigation through the grid display of content.
- the grid display 500 is made up of several graphical elements 502 arranged in a two dimensional grid.
- the two dimensional grid can include rows and columns or include some other two dimensional pattern arrangement, such as a radial or elliptical pattern around one or more central points. In one embodiment, all of the elements move together as one contiguous unit.
- Each graphic element 502 represents a single data entry location from the library or database of content, referred to as the model in the control software, which will be described below in relation to FIG. 7.
- grid display 500 includes graphic elements representing movie posters.
- Grid displays showing graphic elements representing book covers, album or CD covers, or the like can also be used.
- the current item 504 is, in this case, highlighted by adjusting the appearance of the item, e.g., enlarging and centering the element in the view area.
- additional information can be provided with the graphic element related to the specific content associated to the graphic element.
- the specific content associated with the graphic element can be executed, e.g., a movie to be played, a game to be loaded, a web site to be launched, in response to further user input.
- the grid display takes advantage of screen real-estate or display area to provide additional options and context in multiple dimensions. Navigation of the grid display is not constrained to a single, typically horizontal, dimension.
- Data in the grid e.g., movie content, audio content, etc.
- the data or graphic elements 502 are organized according to at least one variable related to the specific content associated with the graphic element. For instance, rows can represent alpha-numeric organization while columns can represent genres.
- Any element 502 can be made up of images representing specific content, such as a defined frame from a piece of recorded content, an image supplied from the network or by the user, or from a library of generic elements either manually or automatically assigned to the content. Any of these elements can be augmented with text, either overlaid on the element itself or displayed along with it, and/or additional smaller elements to indicate the type of content. For example, elements representing content that is locally stored on the receiving device, such as receiving device 108 described in FIG. 1 , can be presented with a small element of a disk drive in the bottom right hand corner of a large image representing the content itself. Elements are configured to be detailed enough for a user to clearly see the type of content they represent.
- Elements could also be in part or fully dynamically created, such as including elements of content currently playing on a broadcast channel.
- an element can be dynamically generated (either locally or delivered from the network) from a scene of recently broadcast video, then combined with a logo or some indication of the channel on which it is currently being broadcast. This would allow a user to see at a glance what is currently on a large number of channels at once.
- FIG. 6 illustrates the user operation and navigation for the grid display using aspects of the present disclosure. Interaction with the grid display 600 shown in FIG. 6 is described in connection with the touch panel device 300 shown in FIG. 3( a ). Gestures, made by a user's hand, on, or above, the touch or capacitively sensitive panel translate to messages that are communicated to the receiving device 108 via a network, such as a home network, as described in FIG. 1 . The messages are translated by a controller in the touch panel device into changes processed by the set top box or receiving device. It is important to note that the messages creating changes can be interpreted by the receiving device 108 in a manner that results in different effects within the physical data structure representing the content library (known as the model) and the display structure (known as the view) portions of the implementation.
- the elements of grid 600 move such that the item in position B (element 602) moves off the screen of the display device toward the upper right and is replaced by the item in position A (element 604); additionally, the item in position C (element 606) moves to position A, as shown in FIG. 6.
- movements can be animated on the display for smooth transition.
- Momentum effects can also be applied to enhance the physics of the view. For example, the rate of speed that the gesture is made can be translated into a distance by which the display is shifted through the grid and/or the grid display.
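The momentum effect above, translating the rate of the gesture into a shift distance, might be sketched as follows. The linear gain, scaling constant, and cap are illustrative assumptions; the disclosure only states that gesture speed is translated into a distance through the grid.

```python
def momentum_distance(swipe_px, duration_s, gain=0.25, max_cells=10):
    """Translate gesture speed into how many grid cells the view shifts.

    `gain` and `max_cells` are assumed tuning parameters; a faster swipe
    covers more of the grid, up to a cap, and any deliberate swipe moves
    the view by at least one cell.
    """
    if duration_s <= 0:
        return 0
    velocity = swipe_px / duration_s          # pixels per second
    cells = int(velocity * gain / 100)        # assumed scaling to grid cells
    return max(1, min(cells, max_cells))      # at least one cell, capped
```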
- the touch panel interface is only one of possibly multiple input devices that could be used for input into the apparatus or system.
- the use of a hand-held angle-sensing controller, as shown in FIG. 3( b ) provides for a number of types of user interaction.
- FIG. 7 illustrates a state control diagram for the implementation of a grid display using aspects of the present disclosure.
- the implementation for the grid display and the interface follows a Model-View-Controller (MVC) coding structure.
- the Model portion 702 or database library, holds (or provides access to) the entire set of data and also correlates virtual x/y coordinates 704 with specific data items that are arranged in a two dimensional matrix.
- the Model portion 702 also tracks the currently selected item 706 within the dataset based on virtual coordinates (ideally centering the selected item when generated in the view).
- the Controller portion 708 converts mouse and other messages 710 (e.g., from a remote input device) into relative x/y coordinate changes that are submitted to the model, which in turn updates its virtual position 704.
- the View portion 712 subscribes to events from the Model 702 and generates the grid for a display based on updates.
- the View portion 712 includes position updates and item detail updates.
- the implementation can include a control interpreter for a remote input device such as a touch panel using gesture interaction. Messages from the remote input device are communicated via interface software and/or hardware to the grid display implementation and are interpreted by the Controller 708 for input into the Model 702 .
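The Model-View-Controller structure described above might be skeletonized as follows. Class names, the message format, and the clamping behavior are assumptions for illustration: the Model holds the dataset extent and virtual position and notifies subscribers, the Controller converts remote-input messages into relative moves, and the View subscribes to position updates.

```python
class GridModel:
    """Model: tracks the virtual x/y position and currently selected item
    within the two dimensional dataset; notifies subscribers on updates."""

    def __init__(self, rows, cols):
        self.rows, self.cols = rows, cols
        self.x = self.y = 0              # virtual position, in cells
        self._subscribers = []

    def subscribe(self, callback):       # the View registers here
        self._subscribers.append(callback)

    def move(self, dx, dy):
        # Clamp so the selection cannot leave the dataset.
        self.x = max(0, min(self.cols - 1, self.x + dx))
        self.y = max(0, min(self.rows - 1, self.y + dy))
        for cb in self._subscribers:
            cb(self.x, self.y)

    @property
    def selected(self):
        return (self.x, self.y)


class GridController:
    """Controller: converts remote-input messages into relative x/y
    changes submitted to the model."""

    def __init__(self, model):
        self.model = model

    def on_message(self, message):
        # The ("swipe", dx, dy) message format is an assumption.
        kind, dx, dy = message
        if kind == "swipe":
            self.model.move(dx, dy)


class GridView:
    """View: subscribes to model events and regenerates the grid display
    based on position updates."""

    def __init__(self, model):
        self.last_update = None
        model.subscribe(self.on_position_update)

    def on_position_update(self, x, y):
        self.last_update = (x, y)        # a real view would re-render here
```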
- a total number of items in a database of graphic elements are determined.
- the data or graphic elements are arranged in a two dimensional array which correlates to a two dimensional grid in virtual space.
- the extent of the virtual space depends on the height and width of the individual grid elements multiplied by the number of rows and columns of data in the data set. Additionally, the dataset need not be arranged symmetrically in the horizontal and vertical dimensions.
- Each data item in the data set contains at least one of an image, a title, a rating, a uniform resource locator (URL), and other metadata related to a specific piece of content, in the example described above, a feature film.
- FIG. 9 illustrates an embodiment of a data structure 900 using aspects of the present disclosure.
- Data structure 900 can include an array of a plurality of data elements arranged based on a pattern for display.
- data structure 900 can include a dataset of 6400 items with the data elements arranged in an array of 80 columns by 80 rows (row and column dimensions need not be identical).
- Data structure 900 illustrates the two dimensional indexing for each data element 902 as it relates between the display area on a display and the array.
- the virtual dimensions for a grid containing 80×80 elements each with a visual dimension of 150×200 pixels would result in a virtual space of 12,000×16,000 pixels.
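The virtual-space arithmetic here is simply the element size multiplied by the number of columns and rows:

```python
def virtual_extent(cols, rows, cell_w, cell_h):
    """Extent of the virtual space in pixels: grid dimensions times
    the visual dimensions of one element."""
    return cols * cell_w, rows * cell_h

# The 80x80 grid of 150x200-pixel elements described above:
assert virtual_extent(80, 80, 150, 200) == (12000, 16000)
```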
- the method of the present disclosure would generate only a fraction of the total space, i.e., a first subset of graphic elements, step 804 . This is accomplished by selecting a “window” into the dataset that constitutes the visible area plus an additional border area to facilitate enough caching to support smooth navigation into adjacent areas of the grid, step 806 .
- FIG. 10 illustrates a single element border 1002 around the visible area 1004 .
- all elements 1001 within border 1002 are generated but only the elements 1003 within the area 1004 are visible.
- the border area 1002 could be increased.
- a loading priority for the data can be established.
- the images should be loaded in priority from center outward to the edge of the border 1002 .
- image loading priority is weighted to the elements coming into view over those leaving the view.
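One possible realization of this loading priority, assuming Manhattan distance from the view center and an illustrative halving weight for cells scrolling into view; the distance metric and weight factor are assumptions, as the disclosure only requires center-outward loading biased toward incoming elements:

```python
def load_order(cells, center, incoming):
    """Order image loads from the center outward; cells entering the view
    are weighted ahead of cells at a similar distance that are leaving.

    `cells` is a list of (col, row) positions, `center` the cell at the
    middle of the visible area, `incoming` the set of cells scrolling in.
    """
    def priority(cell):
        dist = abs(cell[0] - center[0]) + abs(cell[1] - center[1])
        # Halving the distance of incoming cells boosts them in the queue.
        return dist * (0.5 if cell in incoming else 1.0)

    return sorted(cells, key=priority)
```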
- FIG. 10 further illustrates the visible area 1004 , the border area 1002 and the non-generated virtual data space 1006 .
- Generated visual elements are labeled AA (element 1001 ), BA (element 1005 ), etc.
- graphic elements, e.g., element 1007 in the virtual data space 1006 are not generated but are designated to a container or placeholder. As element 1007 enters area 1002 in response to a user input, the container will be loaded and the graphic element associated to same will be loaded or generated.
- step 808 a portion of the first subset of graphic elements will be displayed.
- the elements in the visible area 1004 are generated and visible on the display, while the elements in the pre-generated border area 1002 are generated but not visible on the display.
- the position of at least one graphic element in the first subset is adjusted to a point central on the display in response to a user input and a second subset of graphic elements is displayed with the at least one graphic element at the central point, step 812. It is to be appreciated that as the elements are moved, elements in the border area 1002 will appear quickly in the visible area 1004 since they are already generated.
- the grid display should be configured to avoid navigating beyond the edge.
- the terminal position for an edge should ideally be centered in the view (and emphasized as the selected item), as illustrated in FIG. 11 .
- element 1,1 (element 1101) is centered in the visible area 1004. Even if the user attempts to select element 1,1 (element 1101) and move in a direction toward the lower right hand corner of the display, the elements will stay in the same position, or be locked. In this manner, the user input will not result in a blank screen.
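A sketch of how the view might lock navigation at the dataset edge while still letting the terminal element reach the center of the visible area. The centering margin of half a viewport minus half a cell is an assumption about the layout, not a formula from the disclosure.

```python
def clamp_offset(offset, viewport, extent, cell):
    """Clamp a one-axis virtual-space scroll offset (in pixels) so that
    navigation stops at the dataset edge with the edge element centered.

    At offset = -margin the first cell sits exactly at the viewport
    center; at the upper bound the last cell does. Input past either
    bound is simply locked, so the screen never goes blank.
    """
    margin = viewport / 2 - cell / 2
    lo, hi = -margin, extent - viewport + margin
    return max(lo, min(offset, hi))
```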
- FIG. 12 is a flow chart of the optimization process for displaying a portion of a larger database or library in a grid display using aspects of the present disclosure.
- the optimization involves reusing visual elements rather than allocating and de-allocating them as needed.
- in step 1202, the method starts and proceeds to position an element based on a change, step 1204, i.e., in response to a user input.
- in step 1206, it is determined whether the element's position exceeds the boundary area 1002.
- the view element queries the model for data relevant to its new position in virtual space.
- the virtual space location is determined by the current virtual location offset by the actual display coordinates.
- FIG. 13 shows an illustrative embodiment of the movement of the display elements in the grid display following a response to a user input to shift the window of displayed elements within the database or library.
- the display diagram illustrates how visual elements on the top and right side shift to the bottom and left side to fill in as the grid is moved in a diagonal direction.
- element IB 1302 on the right shifts its physical position to IB′ 1304 on the left.
- the visual element queries the model to obtain data relevant to the new virtual position (in the case of IB′ 1304 that would be data element 34,22).
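The reuse scheme can be sketched in one dimension: an element sliding past the generated area wraps to the opposite side, then derives its new virtual cell from the view's current virtual offset plus its actual display coordinate, as described above. The function names and the single-cell border on each side are illustrative.

```python
def recycle_position(px, viewport, cell):
    """Wrap a display element's pixel position back into the pool of
    on-screen slots instead of allocating and de-allocating elements.

    The generated span is the visible area plus a one-cell border on
    each side; an element leaving one side reappears on the other.
    """
    span = viewport + 2 * cell
    return px % span

def virtual_cell(view_offset, px, cell):
    """Virtual column index of an element: the current virtual-location
    offset plus the element's actual display coordinate, in cells."""
    return (view_offset + px) // cell
```

After wrapping, the element would query the model for the data item at its new virtual coordinates before being redrawn.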
- the grid display of the present disclosure can be used to browse multiple hundreds or even thousands of items, e.g., content such as movies. Generating a visual element for each element can be processor intensive.
- the techniques of the present disclosure provide for minimizing the number of display elements required for generating a grid display view while maintaining the illusion of navigating a larger virtual space of information.
- the elements can be image objects such as album covers or movie posters.
- the structure arranges the objects in a grid display for taking advantage of the two dimensions of a display, i.e., the vertical dimension along with the horizontal dimension, for navigation purposes.
- the navigation aspects associated with the user interface include gesture based movements translated into display changes for the cover grid. Optimizations to the view of a grid display implementation are described such that the number of display elements is minimized and independent of the number of items in the full dataset and that navigation through the database is smooth and efficient with respect to the visual perception of the displayed portion.
- a new user interface and display system for a video display device with a touch screen makes it possible to peek at, that is, view briefly, a second or favored video content while watching a first video content.
- video from a second video source will be seen to partially displace and temporarily replace a portion of a video presently being viewed.
- Selection of the other video sources can be controlled, for example, by swiping with one, two, three or four fingers or finger tips, and by swiping inwardly from any one of the edges of a video display.
- the video presently being viewed can be interchanged with the video being peeked at.
- Patents, published applications and articles of interest related to aspects of video peeking beyond basic touch screen operation include: U.S. Pat. No. 7,864,163B2, related to displaying at least a portion of a structured electronic document on the touch screen display, wherein the structured electronic document comprises a plurality of boxes of content, and detecting a first gesture at a location on the displayed portion of the structured electronic document so that a first box in the plurality of boxes at the location of the first gesture is determined. The first box on the touch screen display is then enlarged and substantially centered; US2010077433A1, related to a method for displaying program content from a subscription television service on a display and receiving a signal to initiate a multi-panel browsing mode on the display.
- the method includes displaying a multi-panel view on the display, the multi-panel view including a panel with the program content and a panel with a top program based on top program information received from the server. Additional panels included in the multi-panel view may include interactive games or other content available from the subscription television service; US2003126605B1, related to an interactive television system designed to populate an electronic program guide (EPG), which provides Video-Clip Previews on Demand by automatically launching a video clip preview, after browsing and navigating through the EPG's grid guide to a highlighted program title cell, and remaining at such highlighted cell for a predetermined delay.
- the display process is a “No-Touch Display” process requiring no selections by the viewer while browsing; and, “Indirect Multi-Touch Interaction for Brushing in Parallel Coordinates”, Kosara, R., Univ. N. Carolina, Visualization and Data Analysis 2011.
- the inventive arrangements for video peeking are, for example, a substantial improvement over picture-in-picture (PIP) functionality because a PIP picture is so much smaller that the PIP picture has less resolution, and because making the PIP picture larger compromises the relative completeness of the primary video source, or main picture.
- Picture-in-picture functionality also does not provide for video from a secondary video source appearing to flow inwardly from the sides of a video display and does not provide automatic disappearance of the video from a secondary video source in an oppositely appearing flow.
- PIP displays must be turned on and turned off by separate actuations of a remote control.
- a user interface in accordance with the inventive arrangement comprises: a touch activatable screen; a touch screen processor capable of detecting swiping motions across areas of the screen, including distinguishing between at least one or more of different directions of the swiping motions, different lengths of the swiping motions and different widths of the swiping motions; a video signal processor for selectively supplying a first video image to the screen and for selectively supplying at least one of a plurality of other video images to the screen; the at least one of the plurality of other video images being selectively supplied to the screen for a given interval of time responsive to a swiping motion across the screen occurring within given ranges of the directions, the lengths and the widths; and, the other video image being supplied to the screen being displayed instead of a portion of the first video image.
- the touch screen processor is preferably capable of detecting swiping motions across areas of the screen, including distinguishing between at least two of different directions of the swiping motions, different lengths of the swiping motions and different widths of the swiping motions.
- the touch screen processor is also preferably capable of detecting swiping motions across areas of the screen, including distinguishing between each of different directions of the swiping motions, different lengths of the swiping motions and different widths of the swiping motions.
- the other video image is preferably supplied to the screen by displacing the portion of the first video image in a sweeping motion that generally corresponds to the direction of the swiping motion.
- the other video image being supplied to the screen preferably recedes from view in a sweeping motion that generally corresponds to a direction opposite to the direction of the swiping motion.
- the screen of most if not all display devices has discernible sides and each side of the screen is a starting point for at least one, two, three or more of the swiping motions.
- each swiping motion is characterized by at least one or more of a point of origin of the swiping motion, swiping widths, swiping directions, and swiping lengths.
- Various combinations of the characteristics of the swiping motions can preferably result in different ones of the plurality of other video images being selectively supplied to the screen. For example, if the screen is generally rectangular, the different combinations of the swiping widths and the swiping directions can provide the other video images selected from any one of at least eight different video sources.
- the maintenance of pressure on the screen at the end of a swiping motion for a given time interval can result in the other video image being substituted for the first video image.
- a swiping motion longer in length than is necessary to initiate displaying one of the other video images on the screen results in the one of the other video images being substituted for the first video image.
- inventive arrangements can also be embodied in a method for controlling a video display having a touch activatable screen, comprising the steps of: displaying a first video image on the screen; detecting a swiping motion across a first area of the screen; distinguishing between different possible origins of the swiping motion, different possible directions of the swiping motion and different possible widths of the swiping motion; selecting at least one of a plurality of other video images responsive to the distinguishing step; supplying the selected video image to the screen for a given interval of time instead of a portion of the first video image; and, upon expiration of the interval of time, terminating the supplying step.
- the inventive arrangements preferably comprise: initiating the supplying step responsive to detecting a first length of the swiping motion occurring within a first range of lengths; and, substituting the selected video image for the first video image responsive to detecting a second length of the swiping motion occurring within a second range of lengths exclusive of the first range.
- the method can comprise the step of substituting the selected video image for the first video image responsive to detecting a user input other than the swiping motion.
- the supplying step further comprises the step of gradually replacing the portion of the first video image by moving the selected video image in a direction that generally corresponds to the direction of the swiping motion.
- the terminating step comprises the step of gradually replacing the selected video image with the portion of the first video image by moving the selected video image in a direction that is generally opposite to the direction of the swiping motion.
- the method preferably comprises the step of associating different ones of the plurality of the other video images with different combinations of the different possible origins, directions and widths of the swiping motion.
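The two exclusive length ranges described in the method steps might be interpreted as follows. The pixel thresholds and action names are illustrative assumptions; the disclosure only requires that a swipe in a first range of lengths initiates a timed peek while a swipe in a second, exclusive range substitutes the selected video for the first.

```python
def interpret_swipe(length, peek_min=50, swap_min=200):
    """Map a swipe length (in pixels) to an action: a swipe within the
    first range starts a timed peek; a longer swipe, in a second range
    exclusive of the first, substitutes the peeked video for the first
    video; anything shorter is ignored."""
    if peek_min <= length < swap_min:
        return "peek"
    if length >= swap_min:
        return "swap"
    return "ignore"
```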
- FIGS. 1-3 and the accompanying description explain the general operation of a video tablet, in the context of methods and apparatus for grid navigation in the video tablet, including a touch activatable user interface.
- FIGS. 4-7 and the accompanying description explain methods and apparatus for implementing the “video peeking” feature described above.
- FIG. 1 is a block diagram of an exemplary system for delivering video content in accordance with the present disclosure
- FIG. 2 is a block diagram of an exemplary receiving device in accordance with the present disclosure
- FIG. 3( a ) is a perspective view of a touch panel in accordance with the present disclosure
- FIG. 3( b ) includes a perspective view of a wireless hand-held angle-sensing remote controller and illustrates a number of exemplary gestures performed with the remote controller;
- FIG. 4 is a graphical flowchart of operation of an exemplary user interface in accordance with an embodiment of the present disclosure
- FIG. 5 illustrates an exemplary embodiment of a user interface of the present disclosure
- FIG. 6 illustrates user operation and navigation of a user interface in accordance with an exemplary embodiment of the present disclosure
- FIG. 7 illustrates a state control diagram for an exemplary embodiment of a user interface in accordance with the present disclosure
- FIG. 8 is a flowchart of an exemplary process for optimizing the user interface in accordance with an embodiment of the present disclosure.
- FIG. 9 illustrates two dimensional indexing for each data element of the user interface
- FIG. 10 illustrates a visible area window and border area of generated graphic elements for a user interface in accordance with an exemplary embodiment of the present disclosure
- FIG. 11 illustrates a view of a user interface in accordance with an exemplary embodiment of the present disclosure
- FIG. 12 is a flowchart of an exemplary process for optimizing the user interface in accordance with another embodiment of the present disclosure.
- FIG. 13 illustrates movement of graphic elements in a grid of a user interface in accordance with an exemplary embodiment of the present disclosure.
- FIG. 14 illustrates a video display useful for explaining the user interface of an exemplary embodiment in accordance with the present disclosure
- FIG. 15 illustrates exemplary single and multiple finger swipes of respective widths corresponding to one, two, three and four fingers in accordance with the inventive arrangements
- FIGS. 16( a )- 16 ( d ) illustrate various video display alternatives in accordance with the flowchart shown in FIG. 14 .
- FIGS. 17( a )- 17 ( e ) sequentially illustrate video peeking in accordance with the present disclosure, wherein for purposes of this application, color pictures are depicted in gray scale.
- the inventive arrangements provide a user interface and display system for a video display device that makes it possible to peek at a second, and perhaps favored, video content while watching other content, referred to herein as video peeking.
- video displays can include, for example and without limitation, video tablets, video pads, smart phones, electronic books, laptop computers, computer monitors and various television apparatus, that are capable of receiving video content from multiple video sources and that can be disposed close enough to a user for touch screen activation and control.
- the video content can also be thought of more generally as primary and secondary videos or video sources; or as a first video source and a plurality of other video sources. Swipe commands on a touch screen interface can, for example, enable a user to peek at video from one of a plurality of other video sources for an interval of time.
- video from one of the other video sources will be seen to partially displace and temporarily replace a portion of the video being viewed with a portion of the other selected video.
- Selection of the other video sources can be controlled, for example, by swiping with one, two, three or four finger tips.
- Selection of the other video sources can be further controlled, for example, by swiping inwardly from any one of the four edges of a rectangular video display.
- a user can easily peek at video from any one of sixteen other or secondary video sources while viewing a video from a primary or first video source, based on how many finger tips are swiped at the same time and based on which side of the video display the swipes originate.
- the touch screen can be further programmed to enable interchanging “peeked at” video from a secondary source with what was previously the video from the primary video source.
- This interchange can, for example and without limitation, be implemented with a longer swipe than needed for a peek. The video from what then becomes a secondary source can thus become swipe-able.
- Video peeking is explained in more detail in conjunction with FIGS. 14-17 .
- a user viewing a first video (or first video content) from a first video source can enable video from one of a plurality of other video sources (or a second video content) to smoothly, partially displace and temporarily replace a portion of the video being viewed with a portion of the other selected video, automatically followed by a smooth disappearance of the other selected video.
- the content of the first video source can also be thought of as a primary video source.
- FIG. 14 is a flow chart 1400 that illustrates the use of swiping as described above to implement and control video peeking at selected video sources in a video tablet or similarly capable device.
- a user selects a plurality of video sources that can be viewed in accordance with the inventive arrangements. These sources of video content can be thought of as “favorite” or “preferred” sources the user expects to view from time to time by video peeking in accordance with the inventive arrangements, while viewing the video content of yet another source altogether.
- For Video Source 1, the user can assign four further video sources (2 through 5) to Horizontal Video Peeking, as Video Sources 2 through 5; and can assign four additional video sources to Vertical Video Peeking, as Video Sources 6 through 9.
- when the user launches the display of Video Source 1 in accordance with step 1404, and Video Source 1 is displayed in accordance with step 1406, the user has available a set of eight different channels or sources of video content that can be briefly viewed by video peeking, without completely interrupting the display of Video Source 1.
- a user can then choose to implement horizontal video peeking by utilizing horizontal screen swiping from the left or right edges or sides of the video display in accordance with step 1408 , or by utilizing vertical screen swiping from the top or bottom edges or sides of the video display in accordance with step 1412 .
- the other sources of video content comprise four sources that can be displayed responsive to left or right directional swiping and four sources that can be displayed responsive to up or down directional swiping.
- the number of sources in FIG. 14 is not limiting for purposes of the invention.
- left-only and right-only directional swiping and up-only and down-only directional swiping can invoke any one of sixteen other sources of video content for video peeking.
- the number of other sources of video content will be limited as a practical matter by certain usability and personal issues. One such issue is how many combinations of directions and numbers of fingers can be remembered by any given user. Another such issue is the manual dexterity of the user. It will therefore be appreciated that with respect to flow chart 1400, selecting fewer than eight other sources of video content is still within the scope of the inventive arrangements. Indeed, even video peeking at a single other source of video content is within the scope of the invention.
- FIG. 15 shows a video display device 1502 .
- the display device has an upper edge or side 1504 , a right side or edge 1506 , a lower edge or side 1508 and a left side or edge 1510 .
- Each of the dashed circles 1512 , 1514 , 1516 and 1518 corresponds to a location on the touch activatable display from which a swiping motion can be initiated.
- the circles are not intended to represent a precise size or shape.
- the dashed circles can be displayed as a temporary training measure, can be permanently displayed, or can remain not displayed.
- the arrows are intended to indicate a swiping direction.
- the arrow associated with dashed circle 1512 is intended to indicate a one-finger swipe.
- the arrows associated with dashed circle 1514 are intended to indicate a two-finger swipe.
- the arrows associated with dashed circle 1516 are intended to indicate a three-finger swipe.
- the arrows associated with dashed circle 1518 are intended to indicate a four-finger swipe.
- a one-finger width downward swipe will invoke video peeking of Video Source 2 into Video Source 1.
- a two-finger width swipe to the left will invoke video peeking of Video Source 7 into Video Source 1.
- a three-finger width upward swipe will invoke video peeking of Video Source 4 into Video Source 1.
- a four-finger width swipe to the right will invoke video peeking of Video Source 9 into Video Source 1.
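The example mappings above (and the assignments of FIG. 15) amount to a lookup keyed on swipe direction and finger count. A minimal sketch, filling in only the four assignments named in the text and treating every other combination as unassigned; the names and structure are illustrative, not part of the disclosure:

```python
from typing import Optional

# Swipe-to-source lookup sketched from the FIG. 15 examples above.
# Only the four (direction, finger count) pairs named in the text are
# filled in; a real implementation would populate all assigned combinations.
PEEK_SOURCES = {
    ("down", 1): 2,   # one-finger downward swipe peeks Video Source 2
    ("left", 2): 7,   # two-finger swipe to the left peeks Video Source 7
    ("up", 3): 4,     # three-finger upward swipe peeks Video Source 4
    ("right", 4): 9,  # four-finger swipe to the right peeks Video Source 9
}

def peek_source(direction: str, fingers: int) -> Optional[int]:
    """Return the video source to peek into Video Source 1, or None if unassigned."""
    return PEEK_SOURCES.get((direction, fingers))
```

With left/right and up/down swipes each distinguished by one to four fingers, such a table can hold up to the sixteen assignments mentioned above.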
- FIG. 16( a ) illustrates video peeking in a downward direction from the top side or edge.
- a ticker or banner is typically displayed at the bottom of news and weather related video content. Accordingly, it is advantageous for a user to select sources of news and weather related video content to be invoked by downward directed swipes, so that the ticker or banner is fully visible.
- the scores and other status information are often displayed at the top of the video content or in the upper left corner of the video content.
- a user familiar with the display practices of the broadcasts of the user's favorite teams can also select swiping options that immediately provide information regarding the status of the sports events.
- FIGS. 17( a ) through 17 ( e ) are a sequence 1700 of video frames 1702 , 1704 , 1706 , 1708 , and 1710 extracted from a video clip to visually illustrate video peeking.
- the content of the first video source (Video Source 1 or the primary video or primary video source) is denoted by 1712 .
- Video Source 1, in this example a penguin, is displayed in frame 1702.
- a circle 1716 and an arrow 1718 are also shown in video frame 1702 .
- part of the primary video 1712 can be seen as being partially displaced and temporarily replaced by a portion of other video content 1714 , in this example, a baseball game, which moves in from the right side.
- in frame 1706, the displacement and replacement of a portion of primary video 1712 is complete.
- in video frame 1708, the left edge of the other video 1714 can be seen moving back to the right as the displayed portion of the other video shrinks.
- in frame 1710, only primary video 1712 is displayed once again.
- video peeking can last for an interval of time, and it is contemplated that the time interval can be adjustable.
- a user can hold or maintain video peeking by continuing to press on the touch screen at the end of the swipe, the video peek thus continuing until the user stops pressing the screen.
- a user peeking at new video who wants to view the peeked-at video in full screen can, for example, do so by swiping across the entire screen or by pressing at the end of a swiping motion.
- the extent to which a secondary video will displace and replace the primary video can be controlled by the length of the swipe.
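The peek behaviors described above (displacement proportional to swipe length, hold-to-maintain at the end of a swipe, and a full-screen switch on a full-screen swipe) can be sketched as a small decision function. The 1.0 full-screen threshold and the action labels below are illustrative assumptions, not terms from the text:

```python
# Sketch of swipe-length-controlled video peeking as described above.
# The full-screen threshold and action names are illustrative assumptions.

def peek_fraction(swipe_len: float, screen_len: float) -> float:
    """Fraction of the screen the secondary video occupies (0.0 to 1.0)."""
    if screen_len <= 0:
        raise ValueError("screen_len must be positive")
    return max(0.0, min(1.0, swipe_len / screen_len))

def peek_action(swipe_len: float, screen_len: float, still_pressed: bool) -> str:
    """Decide what happens when the swiping motion ends."""
    frac = peek_fraction(swipe_len, screen_len)
    if frac >= 1.0:
        return "switch_full_screen"   # swipe across the entire screen
    if still_pressed:
        return "hold_peek"            # maintain peek while the finger stays down
    return "retract_after_interval"   # peek lasts a (possibly adjustable) interval
```

The adjustable time interval mentioned above would govern how long the "retract_after_interval" state persists before the primary video is fully restored.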
- a user can, for example and without limitation, advantageously view sports, news, or entertainment channels, and can peek at other videos to check scores and breaking news, or even determine if a commercial in an entertainment video has ended, and so return to a primary video source.
Description
- This application claims under 35 U.S.C. §119(e) the benefit of the filing date of Provisional Patent Application No. 61/515,578, filed 5 Aug. 2011, and incorporates by reference the subject matter of Provisional Patent Application No. 61/515,578, filed 5 Aug. 2011. Insofar as the background of the invention is concerned, this application is related to commonly owned PCT/US2010/049772, having an International Filing Date of 22 Sep. 2010.
- The background of the invention relates to the general operation of a touch activatable screen in a video tablet or other display device. The description of a touch activatable screen in a video tablet or other display device found herein can also be found in PCT/US2010/049772. Patents, published applications and articles cited in the initial International Search Report for PCT/US2010/049772, and thus relating to the general operation of a touch activatable screen, include: US2009210819A1; US20070013708A1; US20080079972A1; US20090153478A1; US2009019924A1; EP1450277A2; and SHIRAZI J: "Java Performance Tuning, Chapter 4 Object Creation".
- It should be understood that the elements shown in the figures can be implemented in various forms of hardware, software or combinations thereof. Preferably, these elements are implemented in a combination of hardware and software on one or more appropriately programmed general-purpose devices, which can include a processor, memory and input/output interfaces. Herein, the phrase "coupled" is defined to mean directly connected to or indirectly connected with through one or more intermediate components. Such intermediate components can include both hardware and software based components.
- The present description illustrates the principles of the present disclosure. It will thus be appreciated that those skilled in the art will be able to devise various arrangements that, although not explicitly described or shown herein, embody the principles of the disclosure and are included within its spirit and scope.
- All examples and conditional language recited herein are intended for educational purposes to aid the reader in understanding the principles of the disclosure and the concepts contributed by the inventor to furthering the art, and are to be construed as being without limitation to such specifically recited examples and conditions.
- Moreover, all statements herein reciting principles, aspects, and embodiments of the disclosure, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents as well as equivalents developed in the future, i.e., any elements developed that perform the same function, regardless of structure.
- Thus, for example, it will be appreciated by those skilled in the art that the block diagrams presented herein represent conceptual views of illustrative circuitry embodying the principles of the disclosure. Similarly, it will be appreciated that any flow charts, flow diagrams, state transition diagrams, pseudo-code, and the like represent various processes which can be substantially represented in computer readable media and so executed by a computer or processor, whether or not such computer or processor is explicitly shown.
- The functions of the various elements shown in the figures can be provided through the use of dedicated hardware as well as hardware capable of executing software in association with appropriate software. When provided by a processor, the functions can be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which can be shared. Moreover, explicit use of the term “processor” or “controller” should not be construed to refer exclusively to hardware capable of executing software, and can implicitly include, without limitation, digital signal processor (DSP) hardware, read only memory (ROM) for storing software, random access memory (RAM), and nonvolatile storage.
- Other hardware, conventional and/or custom, can also be included. Similarly, any switches shown in the figures are conceptual only. Their function can be carried out through the operation of program logic, through dedicated logic, through the interaction of program control and dedicated logic, or even manually, the particular technique being selectable by the implementer as more specifically understood from the context.
- In the claims hereof, any element expressed as a means for performing a specified function is intended to encompass any way of performing that function including, for example, a) a combination of circuit elements that performs that function or b) software in any form, including, therefore, firmware, microcode or the like, combined with appropriate circuitry for executing that software to perform the function. The disclosure as defined by such claims resides in the fact that the functionalities provided by the various recited means are combined and brought together in the manner which the claims call for. It is thus regarded that any means that can provide those functionalities are equivalent to those shown herein.
- The present disclosure provides for a grid display which is a graphical user interface view that enables a user to navigate a set of data elements in two dimensional space (i.e., x and y directions). The grid display can have a two dimensional (2D) pattern, e.g., columns and rows, but can take other forms. Navigation of the grid display can be accomplished using commands, such as gestures, to locate the desired element. An entry in the grid display is tapped or otherwise selected to initiate further action, e.g., executing or playing associated content. This interface mechanism is for use in media applications where items on the grid display can be represented graphically such as by audio album covers or video poster images. The particular embodiments describe an apparatus and method associated with optimizations to the view of a grid display implementation such that the number of display elements is minimized and independent of the number of items in the full dataset. The embodiments also address issues with navigation through the database so that it can be smooth and efficient with respect to the visual perception of the displayed portion. The apparatus and method can be particularly adapted for use in a content distribution network encompassing controlled access to a large database of media content.
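The view optimization just described, a display-element count independent of the dataset size, is essentially a virtualized window over the item list. A hedged sketch with hypothetical names, mapping a scroll position to the small set of dataset indices that actually need display elements:

```python
# Sketch of the grid-view optimization described above: only the rows that
# are visible (plus a small margin for smooth scrolling) get display
# elements, regardless of how many items the full dataset holds.
# All names and the margin policy are illustrative assumptions.

def visible_window(first_row: int, rows_on_screen: int, cols: int,
                   dataset_size: int, margin_rows: int = 1):
    """Return dataset indices for the rows currently (or nearly) visible."""
    start = max(0, (first_row - margin_rows) * cols)
    stop = min(dataset_size, (first_row + rows_on_screen + margin_rows) * cols)
    return list(range(start, stop))
```

However large the dataset, the number of indices returned is bounded by (rows_on_screen + 2 * margin_rows) * cols, so the display-element pool can be recycled as the user scrolls.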
- Navigation through the user interface of the present disclosure is facilitated by a mechanism to move quickly, simply and accurately across a display, such as a television, monitor, or touch screen. In one embodiment, an input device such as a motion sensing remote controller is provided. In another embodiment, a touch screen or panel remote device is employed having the cursor on a screen essentially tracking the user's finger or fingers as they move across the remote controller's screen. As the user passes over the grid display, the graphic elements representing content in the database move in response to the user's input where certain graphic elements disappear and new graphic elements appear. In the touch screen or panel remote device embodiment, it is to be appreciated that the touch screen or panel remote device can serve as a display device itself or can simply serve as a navigational tool. In a further embodiment, a conventional hand-held remote controller is employed using at least one button or input mechanism disposed on a surface of the remote controller to navigate the grid display.
- Initially, systems for delivering various types of content to a user will be described. Subsequently, a method and user interface for searching the content in accordance with embodiments of the present disclosure will then be detailed.
- Turning now to FIG. 1, a block diagram of an embodiment of a system 100 for delivering content to a home or end user is shown. The content originates from a content source 102, such as a movie studio or production house. The content can be supplied in at least one of two forms. One form can be a broadcast form of content. The broadcast content is provided to the broadcast affiliate manager 104, which is typically a national broadcast service, such as the American Broadcasting Company (ABC), National Broadcasting Company (NBC), Columbia Broadcasting System (CBS), etc. The broadcast affiliate manager can collect and store the content, and can schedule delivery of the content over a delivery network, shown as delivery network 1 (106). Delivery network 1 (106) can include satellite link transmission from a national center to one or more regional or local centers. Delivery network 1 (106) can also include local content delivery using local delivery systems such as over the air broadcast, satellite broadcast, or cable broadcast. The locally delivered content is provided to a receiving device 108 in a user's home, where the content will subsequently be searched by the user. It is to be appreciated that the receiving device 108 can take many forms and can be embodied as a set top box/digital video recorder (DVR), a gateway, a modem, etc. Further, the receiving device 108 can act as entry point, or gateway, for a home network system that includes additional devices configured as either client or peer devices in the home network.
- A second form of content is referred to as special content. Special content can include content delivered as premium viewing, pay-per-view, or other content otherwise not provided to the broadcast affiliate manager, e.g., movies, video games or other video elements. In many cases, the special content can be content requested by the user. The special content can be delivered to a
content manager 110. The content manager 110 can be a service provider, such as an Internet website, affiliated, for instance, with a content provider, broadcast service, or delivery network service. The content manager 110 can also incorporate Internet content into the delivery system. The content manager 110 can deliver the content to the user's receiving device 108 over a separate delivery network, delivery network 2 (112). Delivery network 2 (112) can include high-speed broadband Internet type communications systems. It is important to note that the content from the broadcast affiliate manager 104 can also be delivered using all or parts of delivery network 2 (112) and content from the content manager 110 can be delivered using all or parts of delivery network 1 (106). In addition, the user can also obtain content directly from the Internet via delivery network 2 (112) without necessarily having the content managed by the content manager 110.
- Several adaptations for utilizing the separately delivered content can be possible. In one possible approach, the special content is provided as an augmentation to the broadcast content, providing alternative displays, purchase and merchandising options, enhancement material, etc. In another embodiment, the special content can completely replace some programming content provided as broadcast content. Finally, the special content can be completely separate from the broadcast content, and can simply be a media alternative that the user can choose to utilize. For instance, the special content can be a library of movies that are not yet available as broadcast content.
- The receiving device 108 can receive different types of content from one or both of delivery network 1 and delivery network 2. The receiving device 108 processes the content, and provides a separation of the content based on user preferences and commands. The receiving device 108 can also include a storage device, such as a hard drive or optical disk drive, for recording and playing back audio and video content. Further details of the operation of the receiving device 108 and features associated with playing back stored content will be described below in relation to FIG. 2. The processed content is provided to a display device 114. The display device 114 can be a conventional 2-D type display or can alternatively be an advanced 3-D display.
- A block diagram of an embodiment of a receiving
device 200 is shown in FIG. 2. Receiving device 200 can operate similar to the receiving device 108 described in FIG. 1 and can be included as part of a gateway device, modem, set top box, or other similar communications device. The device 200 shown can also be incorporated into other systems including the display device (114) itself. In either case, several components necessary for complete operation of the system are not shown in the interest of conciseness, as they are well known to those skilled in the art.
- In the
device 200, the content is received in an input signal receiver 202. The input signal receiver 202 can be one of several known receiver circuits used for receiving, demodulation, and decoding signals provided over one of the several possible networks including over the air, cable, satellite, Ethernet, fiber and phone line networks. The desired input signal can be selected and retrieved in the input signal receiver 202 based on user input provided through a control interface (not shown). The decoded output signal is provided to an input stream processor 204. The input stream processor 204 performs the final signal selection and processing, and includes separation of video content from audio content for the content stream. The audio content is provided to an audio processor 206 for conversion from the received format, such as compressed digital signal, to an analog waveform signal. The analog waveform signal is provided to an audio interface 208 and further to the display device 114 or an audio amplifier (not shown). Alternatively, the audio interface 208 can provide a digital signal to an audio output device or display device using a High-Definition Multimedia Interface (HDMI) cable or alternate audio interface such as via a Sony/Philips Digital Interconnect Format (SPDIF). The audio processor 206 also performs any necessary conversion for the storage of the audio signals.
- The video output from the
input stream processor 204 is provided to a video processor 210. The video signal can be one of several formats. The video processor 210 provides, as necessary, a conversion of the video content, based on the input signal format. The video processor 210 also performs any necessary conversion for the storage of the video signals.
- A
storage device 212 stores audio and video content received at the input. The storage device 212 allows later retrieval and playback of the content under the control of a controller 214 and also based on commands, e.g., navigation instructions such as fast-forward (FF) and rewind (Rew), received from a user interface 216. The storage device 212 can be a hard disk drive, one or more large capacity integrated electronic memories, such as static RAM (SRAM), or dynamic RAM (DRAM), or can be an interchangeable optical disk storage system such as a compact disk (CD) drive or digital video disk (DVD) drive.
- The converted video signal, from the
video processor 210, either originating from the input or from the storage device 212, is provided to the display interface 218. The display interface 218 further provides the display signal to a display device of the type described above. The display interface 218 can be an analog signal interface such as red-green-blue (RGB) or can be a digital interface such as HDMI. It is to be appreciated that the display interface 218 will generate the various screens for presenting the search results in a three dimensional grid as will be described in more detail below.
- The
controller 214 is interconnected via a bus to several of the components of the device 200, including the input stream processor 204, audio processor 206, video processor 210, storage device 212, and a user interface 216. The controller 214 manages the conversion process for converting the input stream signal into a signal for storage on the storage device or for display. The controller 214 also manages the retrieval and playback of stored content. Furthermore, as will be described below, the controller 214 performs searching of content and the creation and adjusting of the grid display representing the content, either stored or to be delivered via the delivery networks, described above. The controller 214 is further coupled to control memory 220 (e.g., volatile or non-volatile memory, including RAM, SRAM, DRAM, ROM, programmable ROM (PROM), flash memory, electronically programmable ROM (EPROM), electronically erasable programmable ROM (EEPROM), etc.) for storing information and instruction code for controller 214. Further, the implementation of the memory can include several possible embodiments, such as a single memory device or, alternatively, more than one memory circuit communicatively connected or coupled together to form a shared or common memory. Still further, the memory can be included with other circuitry, such as portions of bus communications circuitry, in a larger circuit.
- To operate effectively, the
user interface 216 of the present disclosure employs an input device that moves a cursor around the display. To further enhance the user experience and to facilitate the display of, and navigation around, a database such as a movie library, a touch panel device 300 can be interfaced to the receiving device 108, as shown in FIG. 3(a). The touch panel device 300 allows operation of the receiving device or set top box based on hand movements, or gestures, and actions translated through the panel into commands for the set top box. In one embodiment, the touch panel 300 can simply serve as a navigational tool to navigate the grid display. In other embodiments, the touch panel 300 will additionally serve as the display device allowing the user to more directly interact with the navigation through the grid display of content.
- Alternatively, a mouse device, a remote control with navigation features, or gesture based remote control can also be used, as will be described below.
- The user interface control can be included in the receiving
device 200 as part of the user interface 216 or as part of controller 214. The user interface control incorporates features useful for display and navigation through a grid representing content in a database as well as for video display of content. The user interface, and more specifically the grid user interface element, is incorporated into a video media player interface that includes scripting or programming capability for manipulation of graphics. The video media player and interface can be implemented in the receiving device 200 using any combination of hardware, software, or firmware. Alternatively, some portion of the control and video display operation can be included in the touch panel device 300 and also can be part of the information transmitted across the home network.
- In another embodiment, the input device is a remote controller, with a form of motion detection, such as a gyroscope or accelerometer, which allows the user to move a cursor freely about a screen or display. An exemplary hand-held angle-sensing remote controller 301 is illustrated in
FIG. 3(b). Remote controller 301 includes a thumb button 302, positioned on the top side of controller 301 so as to be selectively activated by a user's thumb. Activation of thumb button 302 will also be referred to as a "click," a command often associated with activation or launch of a selected function. Controller 301 further includes a trigger button 304, positioned on the bottom side of controller 301 so as to be selectively activated by a user's index (or "trigger") finger. Activation of trigger button 304 will also be referred to as a "trigger," and angular movement (i.e. pitch, yaw and/or roll) of the controller 301 while the trigger is depressed will be referred to as a "trigger-drag." A trigger-drag command is often associated with movement of a cursor, virtual cursor or other indication of the user's interactive position on the display, such as a change of state (i.e., a highlighted or outlined cell), and is commonly used to navigate in and select entries from the interactive display. Additionally, a plurality of buttons 306 are provided for entering numbers and/or letters. In one embodiment, the plurality of buttons 306 is configured similar to a telephone-type keypad.
- The use of a hand-held angle-sensing remote controller, such as controller 301 described in
FIG. 3(b), provides for a number of types of user interaction. When using an angle-sensing controller, changes in yaw map to left-and-right motions, changes in pitch map to up-and-down motions and changes in roll map to rotational motions along a longitudinal axis of the controller. These inputs are used to define gestures and the gestures, in turn, define specific contextual commands. As such, a combination of yaw and pitch can be used to define any 2-dimensional motion, such as a diagonal, and a combination of yaw, pitch and roll can be used to define any 3-dimensional motion, such as a swing. A number of gestures are illustrated in FIG. 3. Gestures are interpreted in context and are identified by defined movements of the controller 301 while the trigger button 304 is held ("trigger-drag" movements).
- Bumping 320 is defined by a two-stroke drawing indicating pointing in one direction, either up, down, left or right. The bumping gesture is associated with specific commands in context. For example, in a TimeShifting mode, a left-
bump gesture 320 indicates rewinding, and a right-bump gesture indicates fast-forwarding. In other contexts, a bump gesture 320 is interpreted to increment a particular value in the direction designated by the bump. Checking 330 is defined as in drawing a checkmark. It is similar to a downward bump gesture 320. Checking is identified in context to designate a reminder, user tag or to select an item or element. Circling 340 is defined as drawing a circle in either direction. It is possible that both directions could be distinguished. However, to avoid confusion, a circle is identified as a single command regardless of direction. Dragging 350 is defined as an angular movement of the controller (a change in pitch and/or yaw) while holding trigger button 304 (i.e., a "trigger drag"). The dragging gesture 350 is used for navigation, speed, distance, time-shifting, rewinding, and forwarding. Dragging 350 can be used to move a cursor, a virtual cursor, or a change of state, such as highlighting, outlining, or selecting on the display. Dragging 350 can be in any direction and is generally used to navigate in two dimensions. However, in certain interfaces, it is preferred to modify the response to the dragging command. For example, in some interfaces, operation in one dimension or direction is favored with respect to other dimensions or directions depending upon the position of the virtual cursor or the direction of movement. Nodding 360 is defined by two fast trigger-drag up-and-down vertical movements. Nodding 360 is used to indicate "Yes" or "Accept." X-ing 370 is defined as in drawing the letter "X." X-ing 370 is used for "Delete" or "Block" commands. Wagging 380 is defined by two trigger-drag fast back-and-forth horizontal movements. The wagging gesture 380 is used to indicate "No" or "Cancel."
- In addition to traditional controls for video playback, the input device will also include a mechanism to invoke or execute at least three separate options on any element selected on the display or screen. These options will be referred to as "Additional Information", "Play" and "Additional Search". The "Additional Information" function is used to display more information about the currently selected element. The "Play" function, assuming it is available for the selected element, will select that element to be played, which can require a secondary user interface for content purchase, etc. The "Additional Search" function represents the mechanism that allows a user to use any element as a source for an additional advanced search that will generate a whole new content set, updating the entire screen based on criteria defined by the selected element. It is to be appreciated that these three options can be associated with predefined or new gestures, for example, on touch panel 300, or each option can be assigned to a predetermined button, for example, of the plurality of buttons 306 on the remote controller 301.
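The gesture vocabulary above can be illustrated with a toy classifier over stroke directions. This is a deliberate simplification (real recognition would operate on sampled yaw/pitch angles while the trigger is held, and bumping is drawn as a two-stroke arrow rather than a single stroke); the function name and labels are illustrative assumptions:

```python
# Toy illustration of interpreting trigger-drag strokes as gestures.
# Each completed stroke is assumed to have already been reduced to a
# compass direction; real input would be sampled angular motion data.

def classify(strokes: list) -> str:
    """Classify a short sequence of stroke directions into a gesture label."""
    if len(strokes) == 2:
        if strokes in (["up", "down"], ["down", "up"]):
            return "nod"    # two fast vertical movements: "Yes" / "Accept"
        if strokes in (["left", "right"], ["right", "left"]):
            return "wag"    # two fast horizontal movements: "No" / "Cancel"
    if len(strokes) == 1 and strokes[0] in ("up", "down", "left", "right"):
        return "bump-" + strokes[0]  # single directional stroke (simplified bump)
    return "unknown"
```

In context, a "bump-left" result would map to rewinding in a TimeShifting mode, per the description above.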
- It is to be appreciated that at least some of the components described above in relation to
FIGS. 1-3 will form an apparatus and/or system for generating the user interface. -
FIG. 4 illustrates a graphical flowchart for the operation of a user interface related to the display and navigational aspects of the grid display of the present disclosure. Initially, video content 401, from either a broadcast source or the managed special source, can be displayed, in step 402. In step 404, entering the main menu of the interface can be accomplished by tapping, or otherwise selecting, on the video screen. The main menu screen can include a number of user informational elements 403, and can also include a portion of the display still displaying the previous video content 401. The video content can continue to run, or can be placed in pause mode.
- Navigation into the content library can include using the search, browse, or recommend button by tapping on or otherwise selecting the desired button. In
step 406, selection of the search or recommend button accesses a linear display structure 405 of certain objects within the library or database, and includes additional criteria for the search or recommendation feature to restrict the coverage size of the database, including actor, genre, or title criteria. The linear grid 405 can be more useful for these access functions due to the restrictions placed on the access to the library and the reduction in searchable content.
- In
step 408, selecting the browse function as a navigation tool pulls up a separate two dimensional grid display 407 of content selections. The browse function provides access to the complete library or database and promotes very little restriction in navigation around the database. The grid display 407 and navigation for the library or database will be described in further detail below. An entry or selection of an element from a content display (for instance by tapping on it), after it has been highlighted or enlarged in one of the previous function operations, opens a detail screen which provides further detail about the selected content entry, step 410. The detail screen also provides additional options for playing or executing, renting, recording, or buying the content as well as options to return to a previous content navigation function described above.
-
FIG. 5 illustrates a detail view of an embodiment of the grid display 500 using aspects of the present disclosure. Grid display 500 operates in a manner similar to the operation of grid display 407 described in FIG. 4. The grid display 500 can be displayed on the display device 114 and manipulated or navigated using the touch panel 300 or other navigation device described earlier. The interface screen can also be displayed on the touch panel device 300 as a remote display, allowing the user to more directly interact with the navigation through the grid display of content.
- The
grid display 500 is made up of several graphical elements 502 arranged in a two dimensional grid. The two dimensional grid can include rows and columns or include some other two dimensional pattern arrangement, such as a radial or elliptical pattern around one or more central points. In one embodiment, all of the elements move together as one contiguous unit. Each graphics element 502 represents a single data entry location from the library or database of content, referred to as the model in the control software, which will be described below in relation to FIG. 7. For example, grid display 500 includes graphic elements representing movie posters. Grid displays showing graphic elements representing book covers, album or CD covers, or the like can also be used. The current item 504 is, in this case, highlighted by adjusting the appearance of the item, e.g., enlarging and centering the element in the view area. When an item is highlighted in response to a user input, additional information can be provided with the graphic element related to the specific content associated with the graphic element. Furthermore, the specific content associated with the graphic element can be executed, e.g., a movie to be played, a game to be loaded, a web site to be launched, in response to further user input.
- The grid display takes advantage of screen real-estate or display area to provide additional options and context in multiple dimensions. Navigation of the grid display is not constrained to a single, typically horizontal, dimension. Data in the grid, e.g., movie content, audio content, etc., can be either arbitrarily or explicitly organized within the two dimensional space. When explicitly organized, the data or
graphic elements 502 are organized according to at least one variable related to the specific content associated with the graphic element. For instance, rows can represent alphanumeric organization while columns can represent genres. - Any
element 502 can be made up of images representing specific content, such as a defined frame from a piece of recorded content, an image supplied from the network or by the user, or an image from a library of generic elements either manually or automatically assigned to the content. Any of these elements can be augmented with text, either overlaid on the element itself or displayed along with it, and/or with additional smaller elements that indicate the type of content. For example, elements representing content that is locally stored on the receiving device, such as receiving device 108 described in FIG. 1, can be presented with a small image of a disk drive in the bottom right hand corner of a larger image representing the content itself. Elements are configured to be detailed enough for a user to clearly see the type of content they represent. Elements could also be in part or fully dynamically created, such as by including elements of content currently playing on a broadcast channel. For example, an element can be dynamically generated (either locally or delivered from the network) from a scene of recently broadcast video, then combined with a logo or some other indication of the channel on which it is currently being broadcast. This would allow a user to see at a glance what is currently playing on a large number of channels at once. -
FIG. 6 illustrates the user operation and navigation for the grid display using aspects of the present disclosure. Interaction with the grid display 600 shown in FIG. 6 is described in connection with the touch panel device 300 shown in FIG. 3(a). Gestures made by a user's hand on, or above, the touch or capacitively sensitive panel translate to messages that are communicated to the receiving device 108 via a network, such as a home network, as described in FIG. 1. The messages are translated by a controller in the touch panel device into changes processed by the set top box or receiving device. It is important to note that the messages creating changes can be interpreted by the receiving device 108 in a manner that results in different effects within the physical data structure representing the content library (known as the model) and the display structure (known as the view) portions of the implementation. - As an example, when initiating a drag motion from lower left to upper right on the touch panel, as shown in
FIG. 3(a), the elements of grid 600 move such that the item in position B (element 602) moves off the screen of the display device toward the upper right and is replaced by the item in position A (element 604). Additionally, the item in position C (element 606) moves to position A, as shown in FIG. 6. - Additionally, movements can be animated on the display for a smooth transition. Momentum effects can also be applied to enhance the physics of the view. For example, the speed at which the gesture is made can be translated into a distance by which the view is shifted through the grid.
- It is important to note that the touch panel interface is only one of possibly multiple input devices that could be used for input into the apparatus or system. For instance, the use of a hand-held angle-sensing controller, as shown in
FIG. 3(b), provides for a number of types of user interaction. -
FIG. 7 illustrates a state control diagram for the implementation of a grid display using aspects of the present disclosure. The implementation for the grid display and the interface follows a Model-View-Controller (MVC) coding structure. The Model portion 702, or database library, holds (or provides access to) the entire set of data and also correlates virtual x/y coordinates 704 with specific data items that are arranged in a two dimensional matrix. The Model portion 702 also tracks the currently selected item 706 within the dataset based on virtual coordinates (ideally centering the selected item when generated in the view). The Controller portion 708 converts mouse and other messages 710 (e.g., from a remote input device) into relative x/y coordinate changes that are submitted to the model, which in turn updates its virtual position 704. The View portion 712 subscribes to events from the Model 702 and generates the grid for a display based on updates. The View portion 712 includes position updates and item detail updates. - Additionally, the implementation can include a control interpreter for a remote input device such as a touch panel using gesture interaction. Messages from the remote input device are communicated via interface software and/or hardware to the grid display implementation and are interpreted by the
Controller 708 for input into the Model 702. - A method for optimizing the display of the grid display and interface will be described in relation to
FIGS. 8-13. - Initially, in
step 802, a total number of items in a database of graphic elements is determined. The data or graphic elements are arranged in a two dimensional array which correlates to a two dimensional grid in virtual space. The extent of the virtual space depends on the height and width of the individual grid elements multiplied by the number of rows and columns of data in the data set. Additionally, the dataset need not be arranged symmetrically in the horizontal and vertical dimensions. Each data item in the data set contains at least one of an image, a title, a rating, and a uniform resource locator (URL), as well as other metadata related to a specific piece of content, in the example described above, a feature film. -
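The Model-View-Controller structure described above in relation to FIG. 7 can be sketched as follows. This is an illustrative outline only; the class names, the message signature and the small sample dataset are assumptions, not the disclosed software:

```python
# Minimal MVC sketch (assumed names): the Model holds the 2D dataset and a
# virtual position, the Controller converts input messages into coordinate
# deltas, and the View subscribes to Model events and redraws on updates.

class Model:
    def __init__(self, items, columns):
        self.items = items            # flat list backing the 2D grid
        self.columns = columns
        self.vx, self.vy = 0, 0       # virtual x/y position being tracked
        self.listeners = []

    def subscribe(self, listener):
        self.listeners.append(listener)

    def move(self, dx, dy):
        # Apply a relative coordinate change, clamped to the grid bounds.
        rows = len(self.items) // self.columns
        self.vx = max(0, min(self.columns - 1, self.vx + dx))
        self.vy = max(0, min(rows - 1, self.vy + dy))
        for listener in self.listeners:
            listener.on_update(self)

    def selected(self):
        return self.items[self.vy * self.columns + self.vx]

class Controller:
    def __init__(self, model):
        self.model = model

    def on_message(self, dx, dy):     # e.g. a delta from a remote input device
        self.model.move(dx, dy)

class View:
    def __init__(self, model):
        model.subscribe(self)
        self.current = model.selected()

    def on_update(self, model):       # regenerate the grid around the new position
        self.current = model.selected()

model = Model(items=[f"movie-{i}" for i in range(12)], columns=4)
view = View(model)
Controller(model).on_message(1, 1)
print(view.current)                   # prints "movie-5"
```

The View never reaches into input handling and the Controller never touches the display, matching the separation the state control diagram describes.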
FIG. 9 illustrates an embodiment of a data structure 900 using aspects of the present disclosure. Data structure 900 can include an array of a plurality of data elements arranged based on a pattern for display. For example, data structure 900 can include a dataset of 6400 items with the data elements arranged in an array of 80 columns by 80 rows (row and column dimensions need not be identical). -
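The arithmetic behind the windowed generation can be sketched as follows. This hypothetical Python fragment assumes the 80×80 dataset above and, as in the pixel example that follows, elements of 150×200 pixels; the visible-window size and the function name are assumptions:

```python
# Sketch of the virtual-space computation and the generated "window":
# only the visible cells plus a one-element cached border are generated.

COLS, ROWS = 80, 80
ELEM_W, ELEM_H = 150, 200          # pixels per grid element (from the example)

virtual_w = COLS * ELEM_W          # 12,000 px of virtual width
virtual_h = ROWS * ELEM_H          # 16,000 px of virtual height

# Assumed visible window of 7 columns x 5 rows, plus a border of 1 element.
VIS_COLS, VIS_ROWS, BORDER = 7, 5, 1

def generated_window(center_col, center_row):
    """Return the (col, row) index ranges to generate around a center cell."""
    half_c, half_r = VIS_COLS // 2, VIS_ROWS // 2
    c0 = max(0, center_col - half_c - BORDER)
    c1 = min(COLS - 1, center_col + half_c + BORDER)
    r0 = max(0, center_row - half_r - BORDER)
    r1 = min(ROWS - 1, center_row + half_r + BORDER)
    return (c0, c1), (r0, r1)

print(virtual_w, virtual_h)          # prints "12000 16000"
print(generated_window(40, 40))      # prints "((36, 44), (37, 43))"
```

With these assumed window dimensions only 9×7 = 63 elements are ever generated, rather than all 6400.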
Data structure 900 illustrates the two dimensional indexing for each data element 902 as it relates between the display area on a display and the array. As an exemplary illustration of the embodiment, the virtual dimensions for a grid containing 80×80 elements, each with a visual dimension of 150×200 pixels, would result in a virtual space of 12,000×16,000 pixels. Rather than load images for all 6400 items in a plane of 12,000×16,000 pixels, the method of the present disclosure generates only a fraction of the total space, i.e., a first subset of graphic elements, step 804. This is accomplished by selecting a “window” into the dataset that constitutes the visible area plus an additional border area, providing enough caching to support smooth navigation into adjacent areas of the grid, step 806. FIG. 10 illustrates a single element border 1002 around the visible area 1004. As will be described in more detail below, all elements 1001 within border 1002 are generated but only the elements 1003 within the area 1004 are visible. In cases where it is desirable to cache a full screen's worth of information in any direction, to support a quick gestural move across an entire screen width, the border area 1002 could be increased. - A loading priority for the data (such as images) can be established. The images should be loaded in priority from center outward to the edge of the
border 1002. As the direction of movement is known, image loading priority is weighted toward the elements coming into view over those leaving the view. FIG. 10 further illustrates the visible area 1004, the border area 1002 and the non-generated virtual data space 1006. Generated visual elements are labeled AA (element 1001), BA (element 1005), etc. It is to be appreciated that graphic elements, e.g., element 1007, in the virtual data space 1006 are not generated but are designated to a container or placeholder. As element 1007 enters area 1002 in response to a user input, the container will be loaded and the graphic element associated with it will be loaded or generated. - In
step 808, a portion of the first subset of graphic elements will be displayed. As shown in FIG. 10, the elements in the visible area 1004 are generated and visible on the display, while the elements in the pre-generated border area 1002 are generated but not visible on the display. In step 810, the position of at least one graphic element in the first subset is adjusted to a central point on the display in response to a user input, and a second subset of graphic elements is displayed with the at least one graphic element at the central point, step 812. It is to be appreciated that as the elements are moved, elements in the border area 1002 will appear quickly in the visible area 1004 since they are already generated. For example, if a user selected element CE (37, 25) and dragged the element to the upper left, at least element GC (41, 23) would move into the border area 1002 and out of the visible area. Additionally, at least element AG (35, 27) will move from border area 1002 to visible area 1004, the transition appearing seamless since element AG was already generated and cached in the border area. - Special exceptions can be handled, such as the case where an edge or a corner of the data is approached. The grid display should be configured to avoid navigating beyond the edge. The terminal position for an edge should ideally be centered in the view (and emphasized as the selected item), as illustrated in
FIG. 11. Here, element 1, 1 (element 1101) is centered in the visible area 1004. Even if the user attempts to select element 1, 1 (element 1101) and move in a direction toward the lower right hand corner of the display, the elements will stay in the same position, or be locked. In this manner, the user input will not result in a blank screen. -
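The edge behavior just described reduces to a simple clamp on the selected position; the function name and grid size below are illustrative assumptions:

```python
# Sketch of the edge-handling rule: the selected position is clamped so
# navigation can never move past the first or last row/column, and the
# terminal element stays locked rather than exposing a blank screen.

def clamp_position(col, row, cols, rows):
    """Lock the selected (col, row) position inside the grid bounds."""
    return max(0, min(cols - 1, col)), max(0, min(rows - 1, row))

# Dragging past the corner of an 80x80 grid simply locks at the corner:
print(clamp_position(-3, -2, 80, 80))   # prints "(0, 0)"
# Interior positions pass through unchanged:
print(clamp_position(41, 23, 80, 80))   # prints "(41, 23)"
```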
FIG. 12 is a flow chart of the optimization process for displaying a portion of a larger database or library in a grid display using aspects of the present disclosure. The optimization involves reusing visual elements rather than allocating and de-allocating them as needed. In step 1202, the method starts and proceeds to position an element based on a change, step 1204, i.e., in response to a user input. In step 1206, it is determined whether the element's position exceeds the border area 1002. When a display element moves outside the pre-generated border area 1002, the element is moved to the opposite edge of the border area, step 1208. In step 1210, the view element queries the model for data relevant to its new position in virtual space. The virtual space location is determined by the current virtual location offset by the actual display coordinates. -
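The reuse loop of FIG. 12 (steps 1204-1210) can be sketched as follows; the modulo wrap, the window size and the names are assumptions, not the disclosed code:

```python
# Sketch of container reuse: an element that leaves one edge of the
# generated window wraps to the opposite edge (step 1208) and then queries
# the model for the data at its new virtual position (step 1210).

GENERATED_COLS = 9   # assumed: visible columns plus the one-element border

def wrap_element(display_col):
    """Wrap a display column that left one edge back to the opposite edge;
    the container is reused, only its graphic needs reloading."""
    return display_col % GENERATED_COLS

def virtual_col(display_col, virtual_offset):
    """Virtual-space location = current virtual offset + display coordinate."""
    return virtual_offset + display_col

col = wrap_element(9)             # pushed off the right edge -> reappears at 0
print(col, virtual_col(col, 34))  # prints "0 34": rebinds to virtual column 34
```

Because the container is never destroyed, only an image load is needed at the new position, which is the saving the optimization aims for.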
FIG. 13 shows an illustrative embodiment of the movement of the display elements in the grid display following a response to a user input to shift the window of displayed elements within the database or library. The display diagram illustrates how visual elements on the top and right side shift to the bottom and left side to fill in as the grid is moved in a diagonal direction. For example, element IB 1302 on the right shifts its physical position to IB′ 1304 on the left. As part of this transition the visual element queries the model to obtain data relevant to the new virtual position (in the case of IB′ 1304 that would be data element 34, 22). In the case of visual element IA 1306 there are actually two moves, one from right to left resulting in IA′ 1308 and another from the top to the bottom resulting in the final location IA″ 1310. In this manner, when an element moves into the border area 1002, it is moved to another position in the border area so the underlying container or placeholder for the data or graphic element does not need to be unloaded. The container will be re-used and only the graphic element needs to be loaded, which is less resource intensive than having to load a new container. - The grid display of the present disclosure can be used to browse multiple hundreds or even thousands of items, e.g., content such as movies. Generating a visual element for each element can be processor intensive. The techniques of the present disclosure provide for minimizing the number of display elements required for generating a grid display view while maintaining the illusion of navigating a larger virtual space of information.
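The loading priority described earlier, center outward and weighted toward elements entering the view, can be sketched as follows; the weighting factor and the names are illustrative assumptions:

```python
# Sketch of image loading priority: cells nearest the center load first,
# biased so that cells ahead of the motion direction (entering the view)
# load before cells behind it (leaving the view).

import math

def load_order(cells, center, motion):
    """Sort (col, row) cells by loading priority for a given view motion."""
    cx, cy = center
    mx, my = motion                  # e.g. (1, 0) = view moving right
    def priority(cell):
        dx, dy = cell[0] - cx, cell[1] - cy
        dist = math.hypot(dx, dy)    # center-outward component
        ahead = dx * mx + dy * my    # positive if the cell is entering view
        return dist - 0.5 * ahead    # assumed weight favoring incoming cells
    return sorted(cells, key=priority)

cells = [(c, r) for c in range(3) for r in range(3)]
print(load_order(cells, center=(1, 1), motion=(1, 0))[:3])
# prints "[(1, 1), (2, 1), (2, 0)]": center first, then the incoming column
```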
- An embodiment of software code that can be used to operate a video display with a touch activatable screen as described above can be found in Provisional Patent Application No. 61/515,578 and in PCT/US2010/049772. The software code represents a means for enabling and implementing the various inventive features taught herein, the aforesaid various inventive features not having been known at the time PCT/US2010/049772 was filed. The software code is exemplary, and it is understood by those skilled in the art that other software code can be developed for implementing the inventive features taught herein. Accordingly, the noted software code is deemed to be understood by those skilled in the art and does not need to be repeated herein.
- Described thus far is an apparatus and method for displaying and navigating through a database or library of elements representing available content. The elements can be image objects such as album covers or movie posters. The structure arranges the objects in a grid display for taking advantage of the two dimensions of a display, i.e., the vertical dimension along with the horizontal dimension, for navigation purposes. The navigation aspects associated with the user interface include gesture based movements translated into display changes for the cover grid. Optimizations to the view of a grid display implementation are described such that the number of display elements is minimized and independent of the number of items in the full dataset, and such that navigation through the database is smooth and efficient with respect to the visual perception of the displayed portion.
- Having described presently preferred embodiments of an apparatus, method and user interface for grid navigation, which are intended to be illustrative and not limiting, it is noted that modifications and variations can be made by persons skilled in the art in light of the above teachings.
- A new user interface and display system for a video display device with a touch screen makes it possible to peek at, that is, view briefly, a second or favored video content while watching a first video content. During a video peek, video from a second video source will be seen to partially displace and temporarily replace a portion of a video presently being viewed. Selection of the other video sources can be controlled, for example, by swiping with one, two, three or four fingers or finger tips, and by swiping inwardly from any one of the edges of a video display. Moreover, the video presently being viewed can be interchanged with the video being peeked at.
- Patents, published applications and articles of interest related to aspects of video peeking beyond basic touch screen operation include: U.S. Pat. No. 7,864,163B2, related to displaying at least a portion of a structured electronic document on the touch screen display, wherein the structured electronic document comprises a plurality of boxes of content, and detecting a first gesture at a location on the displayed portion of the structured electronic document so that a first box in the plurality of boxes at the location of the first gesture is determined. The first box on the touch screen display is then enlarged and substantially centered; US2010077433A1, related to a method for displaying program content from a subscription television service on a display and receiving a signal to initiate a multi-panel browsing mode on the display. The method includes displaying a multi-panel view on the display, the multi-panel view including a panel with the program content and a panel with top program based on top program information received from the server. Additional panels included in the multi-panel view may include interactive games or other content available from the subscription television service; US2003126605B1 related to an interactive television system designed to populate an electronic program guide (EPG), which provides Video-Clip Previews on Demand by automatically launching a video clip preview, after browsing and navigating through the EPG's grid guide to a highlighted program titled cell, and remaining at such highlighted cell for a predetermined delay. The display process is a “No-Touch Display” process requiring no selections by the viewer while browsing; and, “I
NDIRECT MULTI-TOUCH INTERACTION FOR BRUSHING IN PARALLEL COORDINATES”, Kosara, R., Univ. N. Carolina, Visualization and Data Analysis 2011, vol. 7868, SPIE-Int. Soc. Optical Engineering, related to the use of multi-touch interaction to provide fast and convenient interaction with parallel coordinates by using a multi-touch trackpad rather than the screen directly, so the user's hands do not obscure the visualization during interaction, wherein the user employs one, two, three, or four fingers on the trackpad to perform complex selections in a data set. - The inventive arrangements for video peeking are, for example, a substantial improvement over picture-in-picture (PIP) functionality because a PIP picture is so much smaller that the PIP picture has less resolution, and because making the PIP picture larger compromises the relative completeness of the primary video source, or main picture. Picture-in-picture functionality also does not provide for video from a secondary video source appearing to flow inwardly from the sides of a video display and does not provide automatic disappearance of the video from a secondary video source in an oppositely appearing flow. On the contrary, PIP displays must be turned on and turned off by separate actuations of a remote control. The same deficiencies are generally true for picture-outside-picture (POP) functionality.
- A user interface in accordance with the inventive arrangement comprises: a touch activatable screen; a touch screen processor capable of detecting swiping motions across areas of the screen, including distinguishing between at least one or more of different directions of the swiping motions, different lengths of the swiping motions and different widths of the swiping motions; a video signal processor for selectively supplying a first video image to the screen and for selectively supplying at least one of a plurality of other video images to the screen; the at least one of the plurality of other video images being selectively supplied to the screen for a given interval of time responsive to a swiping motion across the screen occurring within given ranges of the directions, the lengths and the widths; and, the other video image being supplied to the screen being displayed instead of a portion of the first video image.
- The touch screen processor is preferably capable of detecting swiping motions across areas of the screen, including distinguishing between at least two of different directions of the swiping motions, different lengths of the swiping motions and different widths of the swiping motions. The touch screen processor is also preferably capable of detecting swiping motions across areas of the screen, including distinguishing between each of the different directions of the swiping motions, the different lengths of the swiping motions and the different widths of the swiping motions.
- The other video image is preferably supplied to the screen by displacing the portion of the first video image in a sweeping motion that generally corresponds to the direction of the swiping motion. At the end of the interval of time, the other video image being supplied to the screen preferably recedes from view in a sweeping motion that generally corresponds to a direction opposite to the direction of the swiping motion.
- The screen of most if not all display devices has discernible sides and each side of the screen is a starting point for at least one, two, three or more of the swiping motions.
- In a presently preferred embodiment taught herein, each swiping motion is characterized by at least one or more of a point of origin of the swiping motion, swiping widths, swiping directions, and swiping lengths. Various combinations of the characteristics of the swiping motions can preferably result in different ones of the plurality of other video images being selectively supplied to the screen. For example, if the screen is generally rectangular, the different combinations of the swiping widths and the swiping directions can provide the other video images selected from any one of at least eight different video sources.
- Further control can be provided in accordance with the inventive arrangements. For example, the maintenance of pressure on the screen at the end of a swiping motion for a given time interval can result in the other video image being substituted for the first video image. Alternatively, a swiping motion longer in length than is necessary to initiate displaying one of the other video images on the screen results in the one of the other video images being substituted for the first video image.
- The inventive arrangements can also be embodied in a method for controlling a video display having a touch activatable screen, comprising the steps of: displaying a first video image on the screen; detecting a swiping motion across a first area of the screen; distinguishing between different possible origins of the swiping motion, different possible directions of the swiping motion and different possible widths of the swiping motion; selecting at least one of a plurality of other video images responsive to the distinguishing step; supplying the selected video image to the screen for a given interval of time instead of a portion of the first video image; and, upon expiration of the interval of time, terminating the supplying step.
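The claimed method steps can be sketched as follows. This is a hedged illustration, not the patent's implementation: the classes, the (edge, finger-count) selection key, and the sample source assignments (taken from the FIG. 16 examples) are assumptions:

```python
# Sketch of the method: detect a swipe, distinguish its origin edge and
# width (finger count), select the associated secondary video, and supply
# it to the screen in place of a portion of the first video image.

class PeekDisplay:
    """Stand-in display: records which video currently occupies the peek area."""
    def __init__(self, primary):
        self.primary = primary
        self.peeking = None

    def show_partial(self, source, direction):
        self.peeking = (source, direction)   # slides in with the swipe

    def hide_partial(self):
        self.peeking = None                  # recedes opposite to the swipe

def video_peek(display, sources, origin_edge, fingers, direction):
    """Distinguish origin/width, select a video, and supply it for the peek."""
    source = sources.get((origin_edge, fingers))
    if source is not None:
        display.show_partial(source, direction)
    return source

# Assumed assignments echoing FIGS. 16(a)-(b): one finger from the top edge
# peeks Video Source 2; two fingers from the right edge peeks Video Source 7.
sources = {("top", 1): "Video Source 2", ("right", 2): "Video Source 7"}
d = PeekDisplay(primary="Video Source 1")
print(video_peek(d, sources, "top", 1, "down"))   # prints "Video Source 2"
d.hide_partial()   # expiration of the interval terminates the supplying step
```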
- With regard to distinguishing at least one range of length of the swiping motion, the inventive arrangements preferably comprise: initiating the supplying step responsive to detecting a first length of the swiping motion occurring within a first range of lengths; and, substituting the selected video image for the first video image responsive to detecting a second length of the swiping motion occurring within a second range of lengths exclusive of the first range. Alternatively, the method can comprise the step of substituting the selected video image for the first video image responsive to detecting a user input other than the swiping motion.
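The two exclusive length ranges can be sketched with illustrative thresholds; the pixel values below are assumptions, as the disclosure does not specify them:

```python
# Sketch of the two length ranges: a swipe within the first range triggers
# a temporary peek, while a longer swipe (second range, exclusive of the
# first) substitutes the selected video for the first video image.

PEEK_RANGE = (50, 200)   # assumed: swipe length in pixels -> temporary peek
SWAP_MIN = 200           # assumed: at or beyond this -> substitute the video

def interpret_swipe_length(length_px):
    """Map a detected swipe length onto the peek / substitute ranges."""
    if PEEK_RANGE[0] <= length_px < PEEK_RANGE[1]:
        return "peek"
    if length_px >= SWAP_MIN:
        return "substitute"
    return "ignore"          # too short to register as either gesture

print(interpret_swipe_length(120))   # prints "peek"
print(interpret_swipe_length(350))   # prints "substitute"
print(interpret_swipe_length(10))    # prints "ignore"
```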
- In a presently preferred embodiment, the supplying step further comprises the step of gradually replacing the portion of the first video image by moving the selected video image in a direction that generally corresponds to the direction of the swiping motion, whereas the terminating step comprises the step of gradually replacing the selected video image with the portion of the first video image by moving the selected video image in a direction that is generally opposite to the direction of the swiping motion. In general, the method preferably comprises the step of associating different ones of the plurality of the other video images with different combinations of the different possible origins, directions and widths of the swiping motion.
-
FIGS. 1-3 and the accompanying description explain the general operation of a video tablet, in the context of methods and apparatus for grid navigation in the video tablet, including a touch activatable user interface. -
FIGS. 4-7 and the accompanying description explain methods and apparatus for implementing the “video peeking” feature described above. - In the drawings, wherein like reference numerals denote similar elements throughout the views:
-
FIG. 1 is a block diagram of an exemplary system for delivering video content in accordance with the present disclosure; -
FIG. 2 is a block diagram of an exemplary receiving device in accordance with the present disclosure; -
FIG. 3( a) is a perspective view of a touch panel in accordance with the present disclosure; -
FIG. 3( b) includes a perspective view of a wireless hand-held angle-sensing remote controller and illustrates a number of exemplary gestures performed with the remote controller; -
FIG. 4 is a graphical flowchart of operation of an exemplary user interface in accordance with an embodiment of the present disclosure; -
FIG. 5 illustrates an exemplary embodiment of a user interface of the present disclosure; -
FIG. 6 illustrates user operation and navigation of a user interface in accordance with an exemplary embodiment of the present disclosure; -
FIG. 7 illustrates a state control diagram for an exemplary embodiment of a user interface in accordance with the present disclosure; -
FIG. 8 is a flowchart of an exemplary process for optimizing the user interface in accordance with an embodiment of the present disclosure; -
FIG. 9 illustrates two dimensional indexing for each data element of the user interface; -
FIG. 10 illustrates a visible area window and border area of generated graphic elements for a user interface in accordance with an exemplary embodiment of the present disclosure; -
FIG. 11 illustrates a view of a user interface in accordance with an exemplary embodiment of the present disclosure; -
FIG. 12 is a flowchart of an exemplary process for optimizing the user interface in accordance with another embodiment of the present disclosure; and -
FIG. 13 illustrates movement of graphic elements in a grid of a user interface in accordance with an exemplary embodiment of the present disclosure. -
FIG. 14 illustrates a video display useful for explaining the user interface of an exemplary embodiment in accordance with the present disclosure; -
FIG. 15 illustrates exemplary single and multiple finger swipes of respective widths corresponding to one, two, three and four fingers in accordance with the inventive arrangements; -
FIGS. 16(a)-16(d) illustrate various video display alternatives in accordance with the flowchart shown in FIG. 14. -
FIGS. 17( a)-17(e) sequentially illustrate video peeking in accordance with the present disclosure, wherein for purposes of this application, color pictures are depicted in gray scale. - It should be understood that the drawings are for purposes of illustrating the concepts of the disclosure and do not necessarily represent the only possible configuration for illustrating and implementing the disclosure.
- The inventive arrangements provide a user interface and display system for a video display device that makes it possible to peek at a first, and perhaps favored, video content while watching other content, referred to herein as video peeking. Such video displays can include, for example and without limitation, video tablets, video pads, smart phones, electronic books, laptop computers, computer monitors and various television apparatus that are capable of receiving video content from multiple video sources and that can be disposed close enough to a user for touch screen activation and control. The video content can also be thought of more generally as primary and secondary videos or video sources; or as a first video source and a plurality of other video sources. Swipe commands on a touch screen interface can, for example, enable a user to peek at video from one of a plurality of other video sources for an interval of time.
- In accordance with the inventive arrangements, video from one of the other video sources will be seen to partially displace and temporarily replace a portion of the video being viewed with a portion of the other selected video. Selection of the other video sources can be controlled, for example, by swiping with one, two, three or four finger tips. Selection of the other video sources can be further controlled, for example, by swiping inwardly from any one of the four edges of a rectangular video display. In this manner, for example, a user can easily peek at video from any one of sixteen other or secondary video sources while viewing a video from a primary or first video source, based on how many finger tips are swiped at the same time and based on which side of the video display the swipes originate. Moreover, the touch screen can be further programmed to enable interchanging “peeked at” video from a secondary source with what was previously the video from the primary video source. This interchange can, for example and without limitation, be implemented with a longer swipe than needed for a peek. The video from what then becomes a secondary source can thus become swipe-able.
- Video peeking is explained in more detail in conjunction with
FIGS. 14-17. In a fundamental embodiment of video peeking, a user viewing a first video (or first video content) from a first video source can enable video from one of a plurality of other video sources (or a second video content) to smoothly, partially displace and temporarily replace a portion of the video being viewed with a portion of the other selected video, automatically followed by a smooth disappearance of the other selected video. The content of the first video source can also be thought of as a primary video source. -
FIG. 14 is a flow chart 1400 that illustrates the use of swiping as described above to implement and control video peeking at selected video sources in a video tablet or similarly capable device. At step 1402 a user selects a plurality of video sources that can be viewed in accordance with the inventive arrangements. These sources of video content can be thought of as “favorite” or “preferred” sources the user expects to view from time to time by video peeking in accordance with the inventive arrangements, while viewing the video content of yet another source altogether. As noted in the flow chart, the viewer can assign a first video source denoted as Video Source 1; can assign 4 further video sources (2 through 5) to Horizontal Video Peeking, as Video Sources 2 through 5; and can assign 4 further additional video sources to Vertical Video Peeking, as Video Sources 6 through 9. Thus, when the user launches the display of Video Source 1 in accordance with step 1404, and Video Source 1 is displayed in accordance with step 1406, the user has available a set of eight different channels or sources of video content that can be briefly viewed by video peeking, without completely interrupting the display of Video Source 1. A user can then choose to implement horizontal video peeking by utilizing horizontal screen swiping from the left or right edges or sides of the video display in accordance with step 1408, or vertical video peeking by utilizing vertical screen swiping from the top or bottom edges or sides of the video display in accordance with step 1412. - As indicated by the blocks associated with
reference numeral 1410, horizontally swiping one, two, three or four fingers will initiate a video peek at the video content or video sources assigned to horizontal video peeking. As indicated by the blocks associated with reference numeral 1414, swiping one, two, three or four fingers vertically will initiate a video peek at the video content or video sources assigned to vertical video peeking. - In the embodiment shown in
FIG. 14 the other sources of video content are four sources that can be displayed responsive to left or right directional swiping and four sources that can be displayed responsive to up or down directional swiping. The number of sources in FIG. 14 is not limiting for purposes of the invention. In fact, for example, left-only and right-only directional swiping and up-only and down-only directional swiping can invoke any one of sixteen other sources of video content for video peeking. However, the number of other sources of video content will be limited as a practical matter by certain practical and personal issues. One such issue is how many combinations of directions and numbers of fingers can be remembered by any given user. Another such issue is the manual dexterity of the user. It will therefore be appreciated that with respect to flow chart 1400, selecting fewer than eight other sources of video content is still within the scope of the inventive arrangements. Indeed, even video peeking at a single other source of video content is within the scope of the invention. -
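One possible encoding of the source assignment in flow chart 1400 follows. The patent describes only the grouping into horizontal and vertical sets of four, so the exact finger-count-to-source mapping shown here is an assumption for illustration:

```python
# Sketch of the flow-chart-1400 assignment: four sources keyed to finger
# count for horizontal peeking and four for vertical peeking.

HORIZONTAL = {1: "Video Source 2", 2: "Video Source 3",
              3: "Video Source 4", 4: "Video Source 5"}
VERTICAL   = {1: "Video Source 6", 2: "Video Source 7",
              3: "Video Source 8", 4: "Video Source 9"}

def peek_source(axis, fingers):
    """Return the assigned source for a swipe axis and finger count."""
    table = HORIZONTAL if axis == "horizontal" else VERTICAL
    return table.get(fingers)    # None if no source is assigned

print(peek_source("horizontal", 3))   # prints "Video Source 4"
print(peek_source("vertical", 1))     # prints "Video Source 6"
```

Fewer assignments, down to a single peekable source, stay within the scheme by simply leaving table entries unset.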
FIG. 15 shows a video display device 1502. The display device has an upper edge or side 1504, a right side or edge 1506, a lower edge or side 1508 and a left side or edge 1510. Each of the dashed circles 1512, 1514, 1516 and 1518 represents a swiping gesture. The arrow associated with dashed circle 1512 is intended to indicate a one-finger swipe. The arrows associated with dashed circle 1514 are intended to indicate a two-finger swipe. The arrows associated with dashed circle 1516 are intended to indicate a three-finger swipe. The arrows associated with dashed circle 1518 are intended to indicate a four-finger swipe. - The practical application of the
flow chart 1400 in FIG. 14 is illustrated in FIGS. 16(a) through 16(d). With reference to flow chart 1400 in FIG. 14, and FIG. 16(a), a one-finger-width downward swipe will invoke video peeking of Video Source 2 into Video Source 1. With reference to flow chart 1400 in FIG. 14, and FIG. 16(b), a two-finger-width swipe to the left will invoke video peeking of Video Source 7 into Video Source 1. With reference to flow chart 1400 in FIG. 14, and FIG. 16(c), a three-finger-width upward swipe will invoke video peeking of Video Source 4 into Video Source 1. With reference to flow chart 1400 in FIG. 14, and FIG. 16(d), a four-finger-width swipe to the right will invoke video peeking of Video Source 9 into Video Source 1. - It should be noted that the association of a swiping width and swiping direction can be made in a way that optimizes the value of video peeking. For example,
FIG. 16(a) illustrates video peeking in a downward direction from the top side or edge. A ticker or banner is typically displayed at the bottom of news and weather related video content. Accordingly, it is advantageous for a user to assign sources of news and weather related video content to downward-directed swipes, so that the ticker or banner is fully visible. Similarly, in sources of video content for sports events, scores and other status information are often displayed at the top of the video content or in its upper left corner. A user familiar with the display practices of the broadcasts of the user's favorite teams can likewise select swiping options that immediately provide information regarding the status of those sports events. -
FIGS. 17(a) through 17(e) are a sequence 1700 of video frames 1702, 1704, 1706, 1708 and 1710 illustrating a video peek. The primary video content (Video Source 1, or the primary video or primary video source) is denoted by 1712. Only Video Source 1, in this example a penguin, is displayed in frame 1702. A circle 1716 and an arrow 1718, as explained in connection with FIGS. 15 and 16(a)-16(d), are also shown in video frame 1702. In frame 1704, and responsive to a swiping motion in the direction of arrow 1718, part of the primary video 1712 can be seen being partially displaced and temporarily replaced by a portion of other video content 1714, in this example a baseball game, which moves in from the right side. In frame 1706, the displacement and replacement of a portion of primary video 1712 is complete. In video frame 1708 the left edge of the other video 1714 can be seen moving to the right and getting smaller. In frame 1710, only primary video 1712 is being displayed once again. - In accordance with presently preferred embodiments, video peeking can last for an interval of time, and it is contemplated that the time interval can be adjustable. In further accordance with the presently preferred embodiments, for example, a user can hold or maintain a video peek by continuing to press on the touch screen at the end of the swipe, the video peek thus continuing until the user stops pressing the screen. In further accordance with the presently preferred embodiments, a user peeking at new video who wants to view the peeked-at video in full screen can, for example, do so by swiping across the entire screen or by pressing at the end of a swiping motion. In this regard, it is also contemplated that the extent to which a secondary video will displace and replace the primary video can be controlled by the length of the swipe.
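The hold, full-screen, and length-controlled behaviors just described can be sketched as follows. This is an assumed model, not the patent's code: the displaced fraction is taken to be proportional to swipe length and clamped, and the state names are illustrative:

```python
# Illustrative sketch of the contemplated peek behaviors: the fraction of the
# primary video displaced by the secondary video tracks the length of the
# swipe; a full-screen swipe promotes the peek to full-screen viewing; and a
# press held at the end of the swipe maintains the peek until released.
def peek_fraction(swipe_px, screen_px):
    """Fraction of the screen the secondary video occupies, clamped to [0, 1]."""
    if screen_px <= 0:
        raise ValueError("screen size must be positive")
    return max(0.0, min(1.0, swipe_px / screen_px))

def peek_state(swipe_px, screen_px, holding):
    """Coarse peek state: 'full' on a full-screen swipe, 'held' while the user
    keeps pressing at the end of the swipe, otherwise a timed 'peek'."""
    if peek_fraction(swipe_px, screen_px) >= 1.0:
        return "full"
    return "held" if holding else "peek"
```

Under this model a half-screen swipe displaces half of the primary video, and releasing the touch from a 'held' peek would start the adjustable timeout after which the primary video is restored, as in frames 1708 and 1710.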
- In accordance with the inventive arrangements, a user can, for example and without limitation, advantageously view sports, news, or entertainment channels, and can peek at other videos to check scores and breaking news, or even determine if a commercial in an entertainment video has ended, and so return to a primary video source.
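The gesture recognition underlying steps 1408 and 1412 — horizontal peeks initiated at the left or right edge, vertical peeks at the top or bottom edge — can be sketched as a simple edge test. The geometry (a fixed edge margin in pixels) and the function name are assumptions for illustration, not from the patent:

```python
# Illustrative sketch: classify where a swipe begins so that edge swipes can
# trigger video peeking. Swipes starting within `margin` pixels of the left or
# right edge are horizontal peeks; within `margin` of the top or bottom edge,
# vertical peeks; anywhere else, no peek is initiated.
def swipe_axis(x, y, width, height, margin=20.0):
    """Return 'horizontal' for a left/right edge start, 'vertical' for a
    top/bottom edge start, or None for a swipe starting mid-screen."""
    if x <= margin or x >= width - margin:
        return "horizontal"
    if y <= margin or y >= height - margin:
        return "vertical"
    return None
```

The axis returned here, combined with the finger count of the touch, would index the user's source assignments from flow chart 1400; a mid-screen start returns `None` so that ordinary in-video touches are not mistaken for peek gestures.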
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/236,105 US9152235B2 (en) | 2011-08-05 | 2012-02-21 | Video peeking |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201161515578P | 2011-08-05 | 2011-08-05 | |
PCT/US2012/025878 WO2013022486A1 (en) | 2011-08-05 | 2012-02-21 | Video peeking |
US14/236,105 US9152235B2 (en) | 2011-08-05 | 2012-02-21 | Video peeking |
Publications (2)
Publication Number | Publication Date |
---|---|
US20140176479A1 true US20140176479A1 (en) | 2014-06-26 |
US9152235B2 US9152235B2 (en) | 2015-10-06 |
Family
ID=45809665
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/236,105 Active 2032-03-01 US9152235B2 (en) | 2011-08-05 | 2012-02-21 | Video peeking |
Country Status (7)
Country | Link |
---|---|
US (1) | US9152235B2 (en) |
EP (1) | EP2740264B1 (en) |
JP (1) | JP6050352B2 (en) |
KR (1) | KR20140044881A (en) |
CN (1) | CN103797784A (en) |
BR (1) | BR112014002039B1 (en) |
WO (1) | WO2013022486A1 (en) |
Cited By (38)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140298245A1 (en) * | 2013-03-28 | 2014-10-02 | Microsoft Corporation | Display Instance Management |
US20140368736A1 (en) * | 2013-06-17 | 2014-12-18 | Sporify AB | System and method for selecting media to be preloaded for adjacent channels |
US20150100885A1 (en) * | 2013-10-04 | 2015-04-09 | Morgan James Riley | Video streaming on a mobile device |
US20150128046A1 (en) * | 2013-11-07 | 2015-05-07 | Cisco Technology, Inc. | Interactive contextual panels for navigating a content stream |
US20150143238A1 (en) * | 2013-11-15 | 2015-05-21 | Lg Electronics Inc. | Mobile terminal and method of controlling the same |
US20150160853A1 (en) * | 2013-12-05 | 2015-06-11 | Naver Corporation | Video transition method and video transition system |
US9063640B2 (en) | 2013-10-17 | 2015-06-23 | Spotify Ab | System and method for switching between media items in a plurality of sequences of media items |
US9229918B2 (en) | 2010-12-23 | 2016-01-05 | Microsoft Technology Licensing, Llc | Presenting an application change through a tile |
US20160065881A1 (en) * | 2014-09-03 | 2016-03-03 | Samsung Electronics Co., Ltd. | Display apparatus and method of controlling the same |
USD765139S1 (en) * | 2013-12-24 | 2016-08-30 | Tencent Technology (Shenzhen) Company Limited | Portion of a display screen with graphical user interface |
USD770325S1 (en) * | 2013-12-24 | 2016-11-01 | Tencent Technology (Shenzhen) Company Limited | Penguin figurine |
US9516082B2 (en) | 2013-08-01 | 2016-12-06 | Spotify Ab | System and method for advancing to a predefined portion of a decompressed media stream |
US20160370957A1 (en) * | 2015-06-18 | 2016-12-22 | Apple Inc. | Device, Method, and Graphical User Interface for Navigating Media Content |
US9529888B2 (en) | 2013-09-23 | 2016-12-27 | Spotify Ab | System and method for efficiently providing media and associated metadata |
US9582157B1 (en) * | 2012-08-03 | 2017-02-28 | I4VU1, Inc. | User interface and program guide for a multi-program video viewing apparatus |
US9654532B2 (en) | 2013-09-23 | 2017-05-16 | Spotify Ab | System and method for sharing file portions between peers with different capabilities |
US20170188087A1 (en) * | 2014-06-11 | 2017-06-29 | Sumsung Electronics Co., Ltd. | User terminal, method for controlling same, and multimedia system |
US20170371510A1 (en) * | 2016-06-28 | 2017-12-28 | Fuji Xerox Co., Ltd. | Information processing apparatus, information processing system, and image forming apparatus |
US9928029B2 (en) | 2015-09-08 | 2018-03-27 | Apple Inc. | Device, method, and graphical user interface for providing audiovisual feedback |
US9990113B2 (en) | 2015-09-08 | 2018-06-05 | Apple Inc. | Devices, methods, and graphical user interfaces for moving a current focus using a touch-sensitive remote control |
EP3358851A1 (en) * | 2017-02-01 | 2018-08-08 | OpenTV, Inc. | Menu modification based on controller manipulation data |
USD826959S1 (en) * | 2016-05-27 | 2018-08-28 | Axis Ab | Display screen or portion thereof with graphical user interface |
USD831700S1 (en) * | 2017-07-31 | 2018-10-23 | Shenzhen Valuelink E-Commerce Co., Ltd. | Display screen or portion thereof with graphical user interface |
USD833473S1 (en) * | 2015-09-24 | 2018-11-13 | 4Thought Sa | Display screen or portion thereof with graphical user interface |
US10222935B2 (en) | 2014-04-23 | 2019-03-05 | Cisco Technology Inc. | Treemap-type user interface |
US10353566B2 (en) | 2011-09-09 | 2019-07-16 | Microsoft Technology Licensing, Llc | Semantic zoom animations |
US10372520B2 (en) | 2016-11-22 | 2019-08-06 | Cisco Technology, Inc. | Graphical user interface for visualizing a plurality of issues with an infrastructure |
US10614616B1 (en) | 2017-09-26 | 2020-04-07 | Amazon Technologies, Inc. | Virtual reality user interface generation |
US10739943B2 (en) | 2016-12-13 | 2020-08-11 | Cisco Technology, Inc. | Ordered list user interface |
USD896235S1 (en) | 2017-09-26 | 2020-09-15 | Amazon Technologies, Inc. | Display system with a virtual reality graphical user interface |
US10862867B2 (en) | 2018-04-01 | 2020-12-08 | Cisco Technology, Inc. | Intelligent graphical user interface |
US20210014570A1 (en) * | 2018-03-28 | 2021-01-14 | Huawei Technologies Co., Ltd. | Video Preview Method and Electronic Device |
EP3754476A4 (en) * | 2018-03-01 | 2021-03-31 | Huawei Technologies Co., Ltd. | Information display method, graphical user interface and terminal |
US10969944B2 (en) | 2010-12-23 | 2021-04-06 | Microsoft Technology Licensing, Llc | Application reporting in an application-selectable user interface |
USD916860S1 (en) | 2017-09-26 | 2021-04-20 | Amazon Technologies, Inc. | Display system with a virtual reality graphical user interface |
US20210343316A1 (en) * | 2014-07-23 | 2021-11-04 | Gopro, Inc. | Scene and activity identification in video summary generation |
US11388455B2 (en) * | 2016-06-02 | 2022-07-12 | Multimo, Llc | Method and apparatus for morphing multiple video streams into single video stream |
US11922006B2 (en) | 2018-06-03 | 2024-03-05 | Apple Inc. | Media control for screensavers on an electronic device |
Families Citing this family (48)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9423878B2 (en) * | 2011-01-06 | 2016-08-23 | Blackberry Limited | Electronic device and method of displaying information in response to a gesture |
US9465440B2 (en) | 2011-01-06 | 2016-10-11 | Blackberry Limited | Electronic device and method of displaying information in response to a gesture |
US9471145B2 (en) | 2011-01-06 | 2016-10-18 | Blackberry Limited | Electronic device and method of displaying information in response to a gesture |
WO2013169849A2 (en) | 2012-05-09 | 2013-11-14 | Industries Llc Yknots | Device, method, and graphical user interface for displaying user interface objects corresponding to an application |
CN105260049B (en) | 2012-05-09 | 2018-10-23 | 苹果公司 | For contacting the equipment for carrying out display additional information, method and graphic user interface in response to user |
WO2013169843A1 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for manipulating framed graphical objects |
KR101683868B1 (en) | 2012-05-09 | 2016-12-07 | 애플 인크. | Device, method, and graphical user interface for transitioning between display states in response to gesture |
WO2013169854A2 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for providing feedback for changing activation states of a user interface object |
JP6082458B2 (en) | 2012-05-09 | 2017-02-15 | アップル インコーポレイテッド | Device, method, and graphical user interface for providing tactile feedback of actions performed within a user interface |
EP3410287B1 (en) | 2012-05-09 | 2022-08-17 | Apple Inc. | Device, method, and graphical user interface for selecting user interface objects |
EP2847661A2 (en) | 2012-05-09 | 2015-03-18 | Apple Inc. | Device, method, and graphical user interface for moving and dropping a user interface object |
WO2013169842A2 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for selecting object within a group of objects |
WO2013169865A2 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input |
WO2013169851A2 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for facilitating user interaction with controls in a user interface |
KR101515623B1 (en) * | 2012-05-14 | 2015-04-28 | 삼성전자주식회사 | Method and apparatus for operating functions of portable terminal having bended display |
JP6097843B2 (en) | 2012-12-29 | 2017-03-15 | アップル インコーポレイテッド | Device, method and graphical user interface for determining whether to scroll or select content |
CN105144057B (en) | 2012-12-29 | 2019-05-17 | 苹果公司 | For moving the equipment, method and graphic user interface of cursor according to the cosmetic variation of the control icon with simulation three-dimensional feature |
KR101905174B1 (en) | 2012-12-29 | 2018-10-08 | 애플 인크. | Device, method, and graphical user interface for navigating user interface hierachies |
JP6093877B2 (en) | 2012-12-29 | 2017-03-08 | アップル インコーポレイテッド | Device, method, and graphical user interface for foregoing generation of tactile output for multi-touch gestures |
US8994828B2 (en) | 2013-02-28 | 2015-03-31 | Apple Inc. | Aligned video comparison tool |
KR101799294B1 (en) * | 2013-05-10 | 2017-11-20 | 삼성전자주식회사 | Display appratus and Method for controlling display apparatus thereof |
KR102134404B1 (en) * | 2013-08-27 | 2020-07-16 | 삼성전자주식회사 | Method for displaying data and an electronic device thereof |
USD770492S1 (en) * | 2014-08-22 | 2016-11-01 | Google Inc. | Portion of a display panel with a computer icon |
CN105446608A (en) * | 2014-09-25 | 2016-03-30 | 阿里巴巴集团控股有限公司 | Information searching method, information searching device and electronic device |
US10048757B2 (en) | 2015-03-08 | 2018-08-14 | Apple Inc. | Devices and methods for controlling media presentation |
US9632664B2 (en) | 2015-03-08 | 2017-04-25 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US9645732B2 (en) | 2015-03-08 | 2017-05-09 | Apple Inc. | Devices, methods, and graphical user interfaces for displaying and using menus |
US10095396B2 (en) | 2015-03-08 | 2018-10-09 | Apple Inc. | Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object |
US9639184B2 (en) | 2015-03-19 | 2017-05-02 | Apple Inc. | Touch input cursor manipulation |
US20170045981A1 (en) | 2015-08-10 | 2017-02-16 | Apple Inc. | Devices and Methods for Processing Touch Inputs Based on Their Intensities |
US9860451B2 (en) | 2015-06-07 | 2018-01-02 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US9891811B2 (en) | 2015-06-07 | 2018-02-13 | Apple Inc. | Devices and methods for navigating between user interfaces |
US10346030B2 (en) | 2015-06-07 | 2019-07-09 | Apple Inc. | Devices and methods for navigating between user interfaces |
US9830048B2 (en) | 2015-06-07 | 2017-11-28 | Apple Inc. | Devices and methods for processing touch inputs with instructions in a web page |
US10200598B2 (en) | 2015-06-07 | 2019-02-05 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
JP6643776B2 (en) * | 2015-06-11 | 2020-02-12 | 株式会社バンダイナムコエンターテインメント | Terminal device and program |
US10416800B2 (en) | 2015-08-10 | 2019-09-17 | Apple Inc. | Devices, methods, and graphical user interfaces for adjusting user interface objects |
US10235035B2 (en) | 2015-08-10 | 2019-03-19 | Apple Inc. | Devices, methods, and graphical user interfaces for content navigation and manipulation |
US9880735B2 (en) | 2015-08-10 | 2018-01-30 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
CN106791352A (en) * | 2015-11-25 | 2017-05-31 | 中兴通讯股份有限公司 | A kind of photographic method, device and terminal |
JP2017126941A (en) * | 2016-01-15 | 2017-07-20 | 株式会社サイバーエージェント | Content distribution system |
US10397632B2 (en) * | 2016-02-16 | 2019-08-27 | Google Llc | Touch gesture control of video playback |
KR20180020830A (en) * | 2016-08-19 | 2018-02-28 | 삼성전자주식회사 | Videowall system, the control method and a display apparatus |
CN109640188B (en) * | 2018-12-28 | 2020-02-07 | 北京微播视界科技有限公司 | Video preview method and device, electronic equipment and computer readable storage medium |
JP6826295B1 (en) * | 2019-09-20 | 2021-02-03 | 株式会社ミクシィ | Computer programs, information processing equipment and information processing methods |
JP7332919B2 (en) * | 2019-09-20 | 2023-08-24 | 株式会社Mixi | Computer program, information processing device and information processing method |
JP6865343B1 (en) * | 2020-02-17 | 2021-04-28 | 株式会社インフォシティ | Information communication terminal device and display control method in the device |
JP2022126543A (en) * | 2021-02-18 | 2022-08-30 | 富士フイルム株式会社 | Information processing device and program |
Family Cites Families (40)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS60124196U (en) | 1984-01-30 | 1985-08-21 | 並木精密宝石株式会社 | Pick-up cartridge generator coil |
JPH11103470A (en) * | 1997-09-26 | 1999-04-13 | Toshiba Corp | Video changeover processing device |
US20030126605A1 (en) | 2001-12-28 | 2003-07-03 | Betz Steve Craig | Method for displaying EPG video-clip previews on demand |
JP3925297B2 (en) * | 2002-05-13 | 2007-06-06 | ソニー株式会社 | Video display system and video display control device |
GB0303888D0 (en) | 2003-02-19 | 2003-03-26 | Sec Dep Acting Through Ordnanc | Image streaming |
US7975531B2 (en) | 2005-03-18 | 2011-07-12 | Nanyang Technological University | Microfluidic sensor for interfacial tension measurement and method for measuring interfacial tension |
US9041744B2 (en) | 2005-07-14 | 2015-05-26 | Telecommunication Systems, Inc. | Tiled map display on a wireless device |
US7532253B1 (en) * | 2005-07-26 | 2009-05-12 | Pixelworks, Inc. | Television channel change picture-in-picture circuit and method |
JP2007334525A (en) * | 2006-06-14 | 2007-12-27 | Sofny Group:Kk | Computer, client/server computer group, server computer, display program, and display representation method |
US7864163B2 (en) | 2006-09-06 | 2011-01-04 | Apple Inc. | Portable electronic device, method, and graphical user interface for displaying structured electronic documents |
US8564543B2 (en) * | 2006-09-11 | 2013-10-22 | Apple Inc. | Media player with imaged based browsing |
US7667719B2 (en) | 2006-09-29 | 2010-02-23 | Amazon Technologies, Inc. | Image-based document display |
JP4973245B2 (en) * | 2007-03-08 | 2012-07-11 | 富士ゼロックス株式会社 | Display device and program |
US8194037B2 (en) | 2007-12-14 | 2012-06-05 | Apple Inc. | Centering a 3D remote controller in a media system |
US8250604B2 (en) | 2008-02-05 | 2012-08-21 | Sony Corporation | Near real-time multiple thumbnail guide with single tuner |
JP5039903B2 (en) | 2008-02-18 | 2012-10-03 | インターナショナル・ビジネス・マシーンズ・コーポレーション | System, method and program for executing application |
JP5016553B2 (en) * | 2008-05-28 | 2012-09-05 | 京セラ株式会社 | Mobile communication terminal and terminal operation method |
US20090328101A1 (en) | 2008-06-30 | 2009-12-31 | Nokia Corporation | User interface for mobile tv interactive services |
KR101526973B1 (en) * | 2008-07-07 | 2015-06-11 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
US20100077433A1 (en) | 2008-09-24 | 2010-03-25 | Verizon Data Services Llc | Multi-panel television browsing |
KR101588660B1 (en) * | 2008-09-30 | 2016-01-28 | 삼성전자주식회사 | A display apparatus capable of moving image and the method thereof |
JP4666053B2 (en) | 2008-10-28 | 2011-04-06 | ソニー株式会社 | Information processing apparatus, information processing method, and program |
JP5202425B2 (en) * | 2009-04-27 | 2013-06-05 | 三菱電機株式会社 | Video surveillance system |
JP5179537B2 (en) * | 2010-04-09 | 2013-04-10 | 株式会社ソニー・コンピュータエンタテインメント | Information processing device |
JP5541998B2 (en) * | 2010-07-28 | 2014-07-09 | 株式会社ソニー・コンピュータエンタテインメント | Information processing device |
JP2012038271A (en) * | 2010-08-11 | 2012-02-23 | Kyocera Corp | Electronic apparatus and method for controlling the same |
US20120069055A1 (en) * | 2010-09-22 | 2012-03-22 | Nikon Corporation | Image display apparatus |
JP5678576B2 (en) * | 2010-10-27 | 2015-03-04 | ソニー株式会社 | Information processing apparatus, information processing method, program, and monitoring system |
US9471145B2 (en) * | 2011-01-06 | 2016-10-18 | Blackberry Limited | Electronic device and method of displaying information in response to a gesture |
US9423878B2 (en) * | 2011-01-06 | 2016-08-23 | Blackberry Limited | Electronic device and method of displaying information in response to a gesture |
US20120262462A1 (en) * | 2011-04-18 | 2012-10-18 | Johan Montan | Portable electronic device for displaying images and method of operation thereof |
US20130141371A1 (en) * | 2011-12-01 | 2013-06-06 | Research In Motion Limited | Electronic device and method of displaying information in response to a gesture |
DE112012005665B4 (en) * | 2012-01-12 | 2022-11-03 | Mitsubishi Electric Corporation | Map display device and map display method |
KR20130090138A (en) * | 2012-02-03 | 2013-08-13 | 삼성전자주식회사 | Operation method for plural touch panel and portable device supporting the same |
JP5882779B2 (en) * | 2012-02-15 | 2016-03-09 | キヤノン株式会社 | Image processing apparatus, image processing apparatus control method, and program |
JP5598737B2 (en) * | 2012-02-27 | 2014-10-01 | カシオ計算機株式会社 | Image display device, image display method, and image display program |
CN102929527A (en) * | 2012-09-27 | 2013-02-13 | 鸿富锦精密工业(深圳)有限公司 | Device with picture switching function and picture switching method |
CN103902080A (en) * | 2012-12-27 | 2014-07-02 | 华硕电脑股份有限公司 | Touch device and touch processing method |
KR102010955B1 (en) * | 2013-01-07 | 2019-08-14 | 삼성전자 주식회사 | Method for controlling preview of picture taken in camera and mobile terminal implementing the same |
KR102134404B1 (en) * | 2013-08-27 | 2020-07-16 | 삼성전자주식회사 | Method for displaying data and an electronic device thereof |
-
2012
- 2012-02-21 WO PCT/US2012/025878 patent/WO2013022486A1/en active Application Filing
- 2012-02-21 EP EP12707443.3A patent/EP2740264B1/en active Active
- 2012-02-21 KR KR1020147002241A patent/KR20140044881A/en not_active Application Discontinuation
- 2012-02-21 CN CN201280043293.0A patent/CN103797784A/en active Pending
- 2012-02-21 US US14/236,105 patent/US9152235B2/en active Active
- 2012-02-21 JP JP2014523917A patent/JP6050352B2/en active Active
- 2012-02-21 BR BR112014002039-6A patent/BR112014002039B1/en active IP Right Grant
Cited By (82)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9229918B2 (en) | 2010-12-23 | 2016-01-05 | Microsoft Technology Licensing, Llc | Presenting an application change through a tile |
US10969944B2 (en) | 2010-12-23 | 2021-04-06 | Microsoft Technology Licensing, Llc | Application reporting in an application-selectable user interface |
US11126333B2 (en) | 2010-12-23 | 2021-09-21 | Microsoft Technology Licensing, Llc | Application reporting in an application-selectable user interface |
US10353566B2 (en) | 2011-09-09 | 2019-07-16 | Microsoft Technology Licensing, Llc | Semantic zoom animations |
US11392288B2 (en) | 2011-09-09 | 2022-07-19 | Microsoft Technology Licensing, Llc | Semantic zoom animations |
US9582157B1 (en) * | 2012-08-03 | 2017-02-28 | I4VU1, Inc. | User interface and program guide for a multi-program video viewing apparatus |
US20140298245A1 (en) * | 2013-03-28 | 2014-10-02 | Microsoft Corporation | Display Instance Management |
US9641891B2 (en) | 2013-06-17 | 2017-05-02 | Spotify Ab | System and method for determining whether to use cached media |
US9635416B2 (en) | 2013-06-17 | 2017-04-25 | Spotify Ab | System and method for switching between media streams for non-adjacent channels while providing a seamless user experience |
US9071798B2 (en) | 2013-06-17 | 2015-06-30 | Spotify Ab | System and method for switching between media streams for non-adjacent channels while providing a seamless user experience |
US9100618B2 (en) | 2013-06-17 | 2015-08-04 | Spotify Ab | System and method for allocating bandwidth between media streams |
US20150365719A1 (en) * | 2013-06-17 | 2015-12-17 | Spotify Ab | System and method for switching between audio content while navigating through video streams |
US20140368736A1 (en) * | 2013-06-17 | 2014-12-18 | Sporify AB | System and method for selecting media to be preloaded for adjacent channels |
US10110947B2 (en) | 2013-06-17 | 2018-10-23 | Spotify Ab | System and method for determining whether to use cached media |
US10455279B2 (en) * | 2013-06-17 | 2019-10-22 | Spotify Ab | System and method for selecting media to be preloaded for adjacent channels |
US9043850B2 (en) | 2013-06-17 | 2015-05-26 | Spotify Ab | System and method for switching between media streams while providing a seamless user experience |
US9503780B2 (en) * | 2013-06-17 | 2016-11-22 | Spotify Ab | System and method for switching between audio content while navigating through video streams |
US9066048B2 (en) | 2013-06-17 | 2015-06-23 | Spotify Ab | System and method for switching between audio content while navigating through video streams |
US9661379B2 (en) | 2013-06-17 | 2017-05-23 | Spotify Ab | System and method for switching between media streams while providing a seamless user experience |
US9654822B2 (en) | 2013-06-17 | 2017-05-16 | Spotify Ab | System and method for allocating bandwidth between media streams |
US9516082B2 (en) | 2013-08-01 | 2016-12-06 | Spotify Ab | System and method for advancing to a predefined portion of a decompressed media stream |
US9654531B2 (en) | 2013-08-01 | 2017-05-16 | Spotify Ab | System and method for transitioning between receiving different compressed media streams |
US10034064B2 (en) | 2013-08-01 | 2018-07-24 | Spotify Ab | System and method for advancing to a predefined portion of a decompressed media stream |
US10110649B2 (en) | 2013-08-01 | 2018-10-23 | Spotify Ab | System and method for transitioning from decompressing one compressed media stream to decompressing another media stream |
US9979768B2 (en) | 2013-08-01 | 2018-05-22 | Spotify Ab | System and method for transitioning between receiving different compressed media streams |
US10097604B2 (en) | 2013-08-01 | 2018-10-09 | Spotify Ab | System and method for selecting a transition point for transitioning between media streams |
US10191913B2 (en) | 2013-09-23 | 2019-01-29 | Spotify Ab | System and method for efficiently providing media and associated metadata |
US9654532B2 (en) | 2013-09-23 | 2017-05-16 | Spotify Ab | System and method for sharing file portions between peers with different capabilities |
US9529888B2 (en) | 2013-09-23 | 2016-12-27 | Spotify Ab | System and method for efficiently providing media and associated metadata |
US9716733B2 (en) | 2013-09-23 | 2017-07-25 | Spotify Ab | System and method for reusing file portions between different file formats |
US9917869B2 (en) | 2013-09-23 | 2018-03-13 | Spotify Ab | System and method for identifying a segment of a file that includes target content |
US20150100885A1 (en) * | 2013-10-04 | 2015-04-09 | Morgan James Riley | Video streaming on a mobile device |
US9792010B2 (en) | 2013-10-17 | 2017-10-17 | Spotify Ab | System and method for switching between media items in a plurality of sequences of media items |
US9063640B2 (en) | 2013-10-17 | 2015-06-23 | Spotify Ab | System and method for switching between media items in a plurality of sequences of media items |
US10397640B2 (en) * | 2013-11-07 | 2019-08-27 | Cisco Technology, Inc. | Interactive contextual panels for navigating a content stream |
US20150128046A1 (en) * | 2013-11-07 | 2015-05-07 | Cisco Technology, Inc. | Interactive contextual panels for navigating a content stream |
US9990125B2 (en) * | 2013-11-15 | 2018-06-05 | Lg Electronics Inc. | Mobile terminal and method of controlling the same |
US20150143238A1 (en) * | 2013-11-15 | 2015-05-21 | Lg Electronics Inc. | Mobile terminal and method of controlling the same |
US9965172B2 (en) * | 2013-12-05 | 2018-05-08 | Naver Corporation | Video transition method and video transition system |
US20150160853A1 (en) * | 2013-12-05 | 2015-06-11 | Naver Corporation | Video transition method and video transition system |
USD765139S1 (en) * | 2013-12-24 | 2016-08-30 | Tencent Technology (Shenzhen) Company Limited | Portion of a display screen with graphical user interface |
USD770325S1 (en) * | 2013-12-24 | 2016-11-01 | Tencent Technology (Shenzhen) Company Limited | Penguin figurine |
US10222935B2 (en) | 2014-04-23 | 2019-03-05 | Cisco Technology Inc. | Treemap-type user interface |
US20170188087A1 (en) * | 2014-06-11 | 2017-06-29 | Sumsung Electronics Co., Ltd. | User terminal, method for controlling same, and multimedia system |
US20210343316A1 (en) * | 2014-07-23 | 2021-11-04 | Gopro, Inc. | Scene and activity identification in video summary generation |
US11776579B2 (en) * | 2014-07-23 | 2023-10-03 | Gopro, Inc. | Scene and activity identification in video summary generation |
US20160065881A1 (en) * | 2014-09-03 | 2016-03-03 | Samsung Electronics Co., Ltd. | Display apparatus and method of controlling the same |
US10073592B2 (en) * | 2015-06-18 | 2018-09-11 | Apple Inc. | Device, method, and graphical user interface for navigating media content |
US10545635B2 (en) | 2015-06-18 | 2020-01-28 | Apple Inc. | Device, method, and graphical user interface for navigating media content |
US11816303B2 (en) | 2015-06-18 | 2023-11-14 | Apple Inc. | Device, method, and graphical user interface for navigating media content |
US20160370957A1 (en) * | 2015-06-18 | 2016-12-22 | Apple Inc. | Device, Method, and Graphical User Interface for Navigating Media Content |
US10572109B2 (en) | 2015-06-18 | 2020-02-25 | Apple Inc. | Device, method, and graphical user interface for navigating media content |
US10073591B2 (en) | 2015-06-18 | 2018-09-11 | Apple Inc. | Device, method, and graphical user interface for navigating media content |
US11262890B2 (en) | 2015-09-08 | 2022-03-01 | Apple Inc. | Devices, methods, and graphical user interfaces for moving a current focus using a touch-sensitive remote control |
US10963130B2 (en) | 2015-09-08 | 2021-03-30 | Apple Inc. | Devices, methods, and graphical user interfaces for moving a current focus using a touch-sensitive remote control |
US10474333B2 (en) | 2015-09-08 | 2019-11-12 | Apple Inc. | Devices, methods, and graphical user interfaces for moving a current focus using a touch-sensitive remote control |
US9928029B2 (en) | 2015-09-08 | 2018-03-27 | Apple Inc. | Device, method, and graphical user interface for providing audiovisual feedback |
US10599394B2 (en) | 2015-09-08 | 2020-03-24 | Apple Inc. | Device, method, and graphical user interface for providing audiovisual feedback |
US11960707B2 (en) | 2015-09-08 | 2024-04-16 | Apple Inc. | Devices, methods, and graphical user interfaces for moving a current focus using a touch-sensitive remote control |
US10152300B2 (en) | 2015-09-08 | 2018-12-11 | Apple Inc. | Device, method, and graphical user interface for providing audiovisual feedback |
US9990113B2 (en) | 2015-09-08 | 2018-06-05 | Apple Inc. | Devices, methods, and graphical user interfaces for moving a current focus using a touch-sensitive remote control |
US11635876B2 (en) | 2015-09-08 | 2023-04-25 | Apple Inc. | Devices, methods, and graphical user interfaces for moving a current focus using a touch-sensitive remote control |
USD833473S1 (en) * | 2015-09-24 | 2018-11-13 | 4Thought Sa | Display screen or portion thereof with graphical user interface |
USD826959S1 (en) * | 2016-05-27 | 2018-08-28 | Axis Ab | Display screen or portion thereof with graphical user interface |
US11388455B2 (en) * | 2016-06-02 | 2022-07-12 | Multimo, Llc | Method and apparatus for morphing multiple video streams into single video stream |
US20170371510A1 (en) * | 2016-06-28 | 2017-12-28 | Fuji Xerox Co., Ltd. | Information processing apparatus, information processing system, and image forming apparatus |
US10372520B2 (en) | 2016-11-22 | 2019-08-06 | Cisco Technology, Inc. | Graphical user interface for visualizing a plurality of issues with an infrastructure |
US11016836B2 (en) | 2016-11-22 | 2021-05-25 | Cisco Technology, Inc. | Graphical user interface for visualizing a plurality of issues with an infrastructure |
US10739943B2 (en) | 2016-12-13 | 2020-08-11 | Cisco Technology, Inc. | Ordered list user interface |
US11042262B2 (en) | 2017-02-01 | 2021-06-22 | Opentv, Inc. | Menu modification based on controller manipulation data |
EP3358851A1 (en) * | 2017-02-01 | 2018-08-08 | OpenTV, Inc. | Menu modification based on controller manipulation data |
USD831700S1 (en) * | 2017-07-31 | 2018-10-23 | Shenzhen Valuelink E-Commerce Co., Ltd. | Display screen or portion thereof with graphical user interface |
US11164362B1 (en) | 2017-09-26 | 2021-11-02 | Amazon Technologies, Inc. | Virtual reality user interface generation |
USD916860S1 (en) | 2017-09-26 | 2021-04-20 | Amazon Technologies, Inc. | Display system with a virtual reality graphical user interface |
USD896235S1 (en) | 2017-09-26 | 2020-09-15 | Amazon Technologies, Inc. | Display system with a virtual reality graphical user interface |
US10614616B1 (en) | 2017-09-26 | 2020-04-07 | Amazon Technologies, Inc. | Virtual reality user interface generation |
EP3754476A4 (en) * | 2018-03-01 | 2021-03-31 | Huawei Technologies Co., Ltd. | Information display method, graphical user interface and terminal |
US11635873B2 (en) | 2018-03-01 | 2023-04-25 | Huawei Technologies Co., Ltd. | Information display method, graphical user interface, and terminal for displaying media interface information in a floating window |
US20210014570A1 (en) * | 2018-03-28 | 2021-01-14 | Huawei Technologies Co., Ltd. | Video Preview Method and Electronic Device |
US11785304B2 (en) * | 2018-03-28 | 2023-10-10 | Huawei Technologies Co., Ltd. | Video preview method and electronic device |
US10862867B2 (en) | 2018-04-01 | 2020-12-08 | Cisco Technology, Inc. | Intelligent graphical user interface |
US11922006B2 (en) | 2018-06-03 | 2024-03-05 | Apple Inc. | Media control for screensavers on an electronic device |
Also Published As
Publication number | Publication date |
---|---|
EP2740264A1 (en) | 2014-06-11 |
US9152235B2 (en) | 2015-10-06 |
KR20140044881A (en) | 2014-04-15 |
JP6050352B2 (en) | 2016-12-21 |
JP2014529212A (en) | 2014-10-30 |
EP2740264B1 (en) | 2016-10-19 |
CN103797784A (en) | 2014-05-14 |
BR112014002039A2 (en) | 2017-03-01 |
WO2013022486A1 (en) | 2013-02-14 |
BR112014002039B1 (en) | 2022-02-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9152235B2 (en) | 2015-10-06 | Video peeking |
JP5681193B2 (en) | | Equipment and method for grid navigation |
US10514832B2 (en) | | Method for locating regions of interest in a user interface |
JP5628424B2 (en) | | System, method, and user interface for content search |
KR100994011B1 (en) | | A control framework with a zoomable graphical user interface for organizing, selecting and launching media items |
US9665616B2 (en) | | Method and system for providing media recommendations |
US10275532B2 (en) | | Method and system for content discovery |
US20150003815A1 (en) | | Method and system for a program guide |
US20150339578A1 (en) | | A method and system for providing recommendations |
Legal Events
Date | Code | Title | Description
---|---|---|---
| | AS | Assignment | Owner name: THOMSON LICENSING, FRANCE. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:WARDENAAR, MATTHEW JACOB;REEL/FRAME:036224/0762. Effective date: 20111028 |
| | STCF | Information on status: patent grant | Free format text: PATENTED CASE |
| | AS | Assignment | Owner name: INTERDIGITAL CE PATENT HOLDINGS, FRANCE. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:THOMSON LICENSING;REEL/FRAME:047332/0511. Effective date: 20180730 |
| | MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY. Year of fee payment: 4 |
| | MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY. Year of fee payment: 8 |
| | AS | Assignment | Owner name: INTERDIGITAL CE PATENT HOLDINGS, SAS, FRANCE. Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE RECEIVING PARTY NAME FROM INTERDIGITAL CE PATENT HOLDINGS TO INTERDIGITAL CE PATENT HOLDINGS, SAS. PREVIOUSLY RECORDED AT REEL: 47332 FRAME: 511. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNOR:THOMSON LICENSING;REEL/FRAME:066703/0509. Effective date: 20180730 |