US20130159919A1 - Systems and Methods for Identifying and Suggesting Emoticons - Google Patents
- Publication number: US20130159919A1
- Application number: US 13/330,357
- Authority: United States (US)
- Prior art keywords: emoticons, emoticon, user, candidate, segments
- Prior art date
- Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/0236—Character input methods using selection techniques to select from displayed items
- G06F3/0237—Character input methods using prediction or retrieval techniques
- G06F3/04812—Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] using icons
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
- G06F3/04886—Interaction via a touch-screen or digitiser partitioned into independently controllable areas, e.g. virtual keyboards or menus
- G06F40/00—Handling natural language data
- G06F40/20—Natural language analysis
- G06F40/205—Parsing
- G06F40/274—Converting codes to words; Guess-ahead of partial word inputs
- G06F40/279—Recognition of textual entities
- G06F40/289—Phrasal analysis, e.g. finite state techniques or chunking
- G06F40/30—Semantic analysis
- G06F40/40—Processing or translation of natural language
- H—ELECTRICITY
- H04M1/7243—User interfaces specially adapted for cordless or mobile telephones, with interactive means for internal management of messages
- H04W4/18—Information format or content conversion, e.g. adaptation by the network for wireless delivery to users or terminals
- H04L51/04—Real-time or near real-time messaging, e.g. instant messaging [IM]
Definitions
- the invention(s) described herein generally relate to emoticons. More particularly, the invention(s) relate to systems and methods for identifying and suggesting emoticons during various activities on a computing device.
- Originally, emoticons were facial expressions represented by characters (e.g., ASCII characters) commonly found on computer keyboards, such as letters, numbers, and symbols.
- These original emoticons, once placed in an electronic message or an electronic posting (e.g., on an electronic bulletin board) by an author, were meant to convey the author's mood or to convey/enhance the overall sentiment of the message or posting.
- Initially, these emoticons were limited to expressing moods, such as happiness, anger, sadness, and indifference.
- Over time, character emoticons expanded to conveying meanings and messages.
- emoticons include character emoticons and emoticons represented by graphical images (hereafter, “graphical emoticons”).
- With the availability of graphical emoticons, a user can depict a greater number of moods, meanings, and messages not previously possible with character emoticons alone.
- character and graphical emoticons are now available for use through a variety of digital devices (e.g., mobile telecommunication devices, and tablets), and are used in a variety of computing activities, especially with respect to the Internet.
- graphical emoticons are commonly available for use when drafting personal e-mails, when posting messages on the Internet (e.g., on social networking site or a web forum), and when messaging between mobile devices.
- the user may access emoticons through a menu or library from which they can browse and select emoticons for use in the computing activity.
- Various embodiments discussed herein provide systems and methods for identifying and suggesting emoticons for segments of texts. Some systems and methods may be utilized during a user activity on a computing device including, without limitation, instant messaging, participating in online chat rooms, drafting e-mails, posting web blogs, or posting to web forums.
- An exemplary method comprises receiving a set of segments from a text field, analyzing the set of segments to determine at least one of a target subtext or a target meaning associated with the set of segments, and identifying a set of candidate emoticons where each candidate emoticon in the set of candidate emoticons has an association between the candidate emoticon and at least one of the target subtext or the target meaning.
- the method may further comprise presenting the set of candidate emoticons for entry selection at a current position of an input cursor, receiving an entry selection for a set of selected emoticons from the set of candidate emoticons, and inserting the set of selected emoticons into the text field at the current position of the input cursor.
- the set of segments may comprise one or more segments of interest selected relative to a current position of an input cursor in the text field, the set of candidate emoticons may comprise one or more candidate emoticons, and the set of selected emoticons may comprise one or more selected emoticons.
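The exemplary method (receive a set of segments, determine a target subtext or meaning, identify associated candidate emoticons) can be sketched as follows. This is only an illustrative sketch: the keyword table, subtext labels, and emoticon associations are hypothetical stand-ins for the semantic analysis and datastore lookups the patent describes.

```python
# Hypothetical keyword tables standing in for the patent's semantic analysis.
SUBTEXT_KEYWORDS = {
    "happy": {"happy", "glad", "great", "awesome"},
    "sad": {"sad", "sorry", "unfortunately"},
}
EMOTICON_ASSOCIATIONS = {
    "happy": [":-)", ":D"],
    "sad": [":-(", ":'("],
}

def analyze_segments(segments):
    """Determine a target subtext for the set of segments, or None if undetermined."""
    words = {w.strip(".,!?").lower() for s in segments for w in s.split()}
    for subtext, keywords in SUBTEXT_KEYWORDS.items():
        if words & keywords:
            return subtext
    return None

def identify_candidate_emoticons(segments):
    """Identify candidate emoticons associated with the determined subtext."""
    return EMOTICON_ASSOCIATIONS.get(analyze_segments(segments), [])

print(identify_candidate_emoticons(["I am so happy today"]))  # [':-)', ':D']
```

In a real embodiment the keyword match would be replaced by genuine semantic analysis, and the association table by queries against an emoticon datastore.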
- analyzing the set of segments may comprise semantic analysis of the set of segments.
- each association may comprise a statistical usage of the candidate emoticon with at least one of the target subtext or the target meaning. Additionally, for some embodiments, the method may further comprise updating the statistical usage of the candidate emoticons based on the entry selection for the set of selected emoticons. Depending on the embodiment, the statistical usage may be based on usage by a single user or by a plurality of users.
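One plausible realization of such statistical-usage associations is a counter keyed by (emoticon, subtext) pairs that is incremented on each entry selection and consulted when ranking candidates. The flat counter shape below is an assumption for illustration, not the patent's prescribed storage.

```python
from collections import defaultdict

# Hypothetical statistical-usage store: (emoticon, subtext) -> selection count.
usage = defaultdict(int)

def record_selection(emoticon, subtext):
    """Update the statistical usage after an entry selection."""
    usage[(emoticon, subtext)] += 1

def rank_candidates(candidates, subtext):
    """Order candidate emoticons by their recorded usage with the subtext."""
    return sorted(candidates, key=lambda e: usage[(e, subtext)], reverse=True)

record_selection(":D", "happy")
record_selection(":D", "happy")
record_selection(":-)", "happy")
print(rank_candidates([":-)", ":D"], "happy"))  # [':D', ':-)']
```

Whether `usage` reflects a single user or a plurality of users is simply a matter of whose selections feed `record_selection`.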
- Presenting the set of emoticons for entry selection may involve displaying the emoticon, for entry selection, at or near the current position of the input cursor.
- Presenting the set of candidate emoticons for entry selection may comprise displaying the set of candidate emoticons, for entry selection, on a physical input device or a virtual input device (e.g., an on-screen keyboard or a projected keyboard), wherein the physical input device and the virtual input device are configured to execute the entry selection.
- the virtual input device may be displayed by a display device that is also displaying the text field. Additionally, the virtual input device may be displayed in close proximity to the text field.
- the method may further comprise identifying the set of segments using syntactical analysis.
- Each segment of interest may comprise at least one of a word, a sentence fragment, a sentence, a phrase, or a passage that precedes or follows a current position of an input cursor.
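A hedged sketch of selecting "segments of interest" relative to the input cursor: here, the words of the sentence fragment immediately preceding the cursor position. The windowing rule (current sentence only, capped word count) is an assumption, not the patent's prescribed syntactical analysis.

```python
def segments_of_interest(text, cursor_pos, max_words=10):
    """Return up to max_words words preceding the cursor within the current sentence."""
    before = text[:cursor_pos]
    for stop in ".!?":
        # Keep only the fragment after the last sentence-ending punctuation.
        before = before.rsplit(stop, 1)[-1]
    return before.split()[-max_words:]

text = "It rained all day. I finally got home"
print(segments_of_interest(text, len(text)))  # ['I', 'finally', 'got', 'home']
```

A fuller embodiment could also look at segments following the cursor, or at whole phrases and passages, as the paragraph above allows.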
- identifying the set of candidate emoticons may be further based on at least a user preference, user-related information, or recipient-related information.
- the user-related information may include a user interest, a user ethnicity, a user religion, a user geographic location, a user age, a user relational status, and a user occupation.
- the recipient-related information may include a recipient's relation to a user, a recipient interest, a recipient ethnicity, a recipient religion, a recipient geographic location, a recipient age, a recipient relational status, and a recipient occupation.
- An exemplary system comprises a processor, a display module, an input module, a segment analysis module, an emoticon search module, an emoticon suggestion module, and an emoticon selection module.
- the display module may be configured to display a text field and one or more segments entered into the text field.
- the input module may be configured to receive segment input from a user and to enter the segment input into the text field at an input cursor.
- the segment analysis module may be configured to receive a set of segments from the text field, wherein the set of segments comprises one or more segments of interest selected relative to a current position of the input cursor in the text field.
- the segment analysis module may be further configured to use the processor to analyze the set of segments to determine at least one of a target subtext or a target meaning associated with the set of segments.
- the emoticon search module may be configured to identify a set of candidate emoticons, wherein each candidate emoticon in the set of candidate emoticons has an association between the candidate emoticon and at least one of the target subtext or the target meaning, and wherein the set of candidate emoticons comprises one or more candidate emoticons.
- the emoticon suggestion module may be configured to present the set of candidate emoticons through the display module for entry selection at the current position of the input cursor.
- the emoticon selection module may be configured to receive from the input module an entry selection for a set of selected emoticons from the set of candidate emoticons, wherein the set of selected emoticons comprises one or more selected emoticons.
- the emoticon selection module may be further configured to insert the set of selected emoticons into the text field at the current position of the input cursor.
- the system may further comprise an emoticon datastore comprising one or more emoticons capable of entry into the text field, wherein the emoticon search module is further configured to identify the set of candidate emoticons on the emoticon datastore.
- each association may comprise a statistical usage of the candidate emoticon with at least one of the target subtext or the target meaning.
- the emoticon selection module may be further configured to update the statistical usage of the candidate emoticons based on the entry selection for the set of selected emoticons.
- presenting the set of emoticons through the display module for entry selection may comprise displaying the emoticon, for entry selection, at or near the current position of the input cursor.
- the input module may comprise a physical input device or a virtual input device, wherein the physical input device and the virtual input device are configured to execute the entry selection.
- FIG. 1 depicts an example of an environment in which various embodiments may be utilized.
- FIG. 2 is a block diagram of an exemplary emoticon suggestion system in accordance with some embodiments.
- FIG. 3 is a flow chart of an exemplary method for identifying and suggesting an emoticon in accordance with some embodiments.
- FIG. 4 is a block diagram of an exemplary emoticon suggesting system using a client-server architecture in accordance with some embodiments.
- FIG. 5 depicts a user interface of a messaging application, where the messaging application utilizes an embodiment.
- FIG. 6 depicts a user interface of a messaging application, where the messaging application utilizes an embodiment.
- FIG. 7 is a block diagram of an exemplary digital device.
- a number of embodiments described herein relate to systems and methods that identify and suggest emoticons during a variety of activities on a computing device involving typing characters into a text field.
- Various systems and methods may identify the emoticon by analyzing a context of segments present in the text field and identifying one or more candidate emoticons available for entry into the text field based on that context. Subsequently, the user may select one or more emoticons from the candidate emoticons and the selected emoticons may be entered into the text field. Optionally, the user could choose to ignore the emoticon suggestion(s) entirely, and continue with their activities on the computing device.
- a “segment” may comprise one or more characters that represent a word, a phrase, a sentence fragment, a sentence, or a passage.
- analysis of the context of segments present in the text field may involve determining a subtext or a meaning relating to those segments, which may require semantic analysis of those segments. Also, as described herein, the association between a particular candidate emoticon and a particular subtext or meaning may be based on (past) statistical usage of the particular candidate emoticon with the particular subtext or meaning.
- such emoticon usage may be based on a user's personal usage of the particular emoticon with the particular subtext or meaning (e.g., user's selection of suggested emoticons in the particular subtext or meaning), or may be based on a community's usage of the particular emoticon with the particular subtext or meaning (e.g., observed usage of certain emoticons in postings on a social network by a community of users).
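One hypothetical way to combine the two usage sources named above is to blend the user's personal selection counts with community-wide usage counts when scoring a candidate. The 0.7/0.3 weighting and the count tables below are purely illustrative assumptions.

```python
def blended_score(emoticon, context, personal, community, personal_weight=0.7):
    """Score an emoticon for a context from personal and community usage counts."""
    p = personal.get((emoticon, context), 0)
    c = community.get((emoticon, context), 0)
    return personal_weight * p + (1 - personal_weight) * c

# Toy counts: the user slightly favors ":D", the community heavily uses ":-)".
personal = {(":D", "happy"): 5}
community = {(":-)", "happy"): 100, (":D", "happy"): 40}
best = max([":-)", ":D"], key=lambda e: blended_score(e, "happy", personal, community))
print(best)  # ':-)'  (community usage outweighs the small personal count here)
```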
- FIG. 1 depicts an example of an environment 100 in which various embodiments may be utilized.
- the environment 100 comprises a tablet computing device 104 , a local emoticon datastore 102 coupled to the tablet computing device 104 , a smartphone computing device 108 , a local emoticon datastore 106 coupled to the smartphone computing device 108 , a desktop computing device 112 , a local emoticon datastore 114 coupled to the desktop computing device 112 , an emoticon suggestion server 116 , and a remote emoticon datastore 118 coupled to the emoticon suggestion server 116 .
- the environment 100 further comprises a communication network 110 over which the tablet computing device 104 , the smartphone computing device 108 , the desktop computing device 112 , and the emoticon suggestion server 116 communicate.
- the tablet computing device 104 , the smartphone computing device 108 , the desktop computing device 112 , and the emoticon suggestion server 116 are examples of digital devices having a processor and memory.
- Other exemplary digital devices with which various embodiments may be utilized include laptops, netbooks, notebooks, media devices, music devices, personal digital assistants (PDAs), or the like. Exemplary digital devices are further described in FIG. 7 .
- the tablet computing device 104 , the smartphone computing device 108 , and the desktop computing device 112 may be exemplary digital devices that utilize systems and methods for identifying and suggesting emoticons for entry.
- such computing devices may utilize certain embodiments to identify and suggest emoticons when a user is using an instant messaging application on such computing devices, or when the user is posting a message on a website forum through such computing devices.
- Those of ordinary skill in the art will appreciate that other digital devices could be utilized in conjunction with various embodiments described herein.
- the emoticon suggestion server 116 may facilitate the identification and suggestion of an emoticon for a user at a digital device. As later described herein, the emoticon suggestion server 116 may determine the context of a segment, may identify one or more candidate emoticons based on a determined context, may suggest one or more candidate emoticons to a digital device, or may perform some combination thereof. For various embodiments, the emoticon suggestion server 116 may be a service operating on a server that hosts an Internet service, where the emoticon suggestion server 116 provides emoticon suggestion functionality to the Internet service.
- the emoticon suggestion server 116 may be a service operating on a web server that is hosting a website (e.g., a website forum or a social networking website) that is being serviced by the emoticon suggestion server 116 (i.e., that is being provided emoticon suggestions by the emoticon suggestion server 116 ).
- various operations and components for identifying and suggesting an emoticon may be isolated to the digital device that utilizes the emoticon suggestions, or may be distributed on varying levels amongst two or more digital devices.
- a system or method for identifying, suggesting, and entering an emoticon when drafting an e-mail on the smartphone computing device 108 may be entirely embedded in an e-mail application that is stored and operated on the smartphone computing device 108 .
- a system or method for identifying, suggesting, and entering an emoticon may utilize the tablet computing device 104 to determine the context of the message as currently prepared, utilize the emoticon suggestion server 116 to identify one or more candidate emoticons for use in the message as currently prepared, and then utilize the tablet computing device 104 to present the candidate emoticons as suggested emoticons.
- the emoticon suggestion server 116 may utilize the remote emoticon datastore 118 during the identification and suggestion of emoticons to digital devices.
- the remote emoticon datastore 118 may comprise a library of emoticons available for suggestion by the emoticon suggestion server 116 , and associations between emoticons in the library and contexts (e.g., subtexts and meanings).
- the remote emoticon datastore 118 may comprise a library of “happy face” emoticons, and associations between the “happy face” emoticons and a happy context.
- the remote emoticon datastore 118 may comprise a library of “San Francisco” emoticons, and associations between the “San Francisco” emoticons and contexts that explicitly or implicitly refer to the city of San Francisco.
- the remote emoticon datastore 118 may comprise two or more associations between a given emoticon and a given context (e.g., subtext or meaning).
- the remote emoticon datastore 118 may comprise a library of “frowning face” emoticons, associations between the “frowning face” emoticons and a sad context, and associations between the “frowning face” emoticons and a displeased context.
- a variety of emoticon libraries and a variety of associations between emoticons and contexts can be stored on the remote emoticon datastore 118 .
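A minimal sketch of the datastore layout implied above: each emoticon record holds one or more context associations, so the same emoticon (such as a "frowning face") can be found under several contexts. The schema and records are hypothetical.

```python
# Hypothetical emoticon datastore: emoticon -> record with associated contexts.
emoticon_datastore = {
    ":-)": {"contexts": {"happy"}},
    ":-(": {"contexts": {"sad", "displeased"}},             # two associations
    "[GoldenGateBridge]": {"contexts": {"san francisco"}},  # a graphical emoticon, by ID
}

def emoticons_for_context(context):
    """Return every emoticon associated with the given context."""
    return sorted(e for e, rec in emoticon_datastore.items()
                  if context in rec["contexts"])

print(emoticons_for_context("sad"))         # [':-(']
print(emoticons_for_context("displeased"))  # [':-(']
```

In practice each association could also carry the statistical-usage figures described earlier, rather than being a bare set membership.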
- the library of emoticons may comprise emoticons that are accessible by any user, emoticons with restricted access (e.g., premium emoticons, or emoticons accessible only to certain groups of users), user-customized or user-uploaded emoticons, or emoticons that are user favorites.
- emoticons used in various embodiments may include those that relate to interests, hobbies, geographic locations, events, holidays, seasons, weather, and the like.
- Emoticons stored on the remote emoticon datastore 118 may include character emoticons, graphical emoticons, graphically animated emoticons, and emoticons accompanied by sound.
- the remote emoticon datastore 118 may further comprise user preferences, user-related information, or recipient-related information, which may be utilized by embodiments when identifying emoticons suitable for suggestion.
- the remote emoticon datastore 118 may store a user preference that causes an embodiment to suggest user-defined or user-uploaded emoticons before suggesting emoticons generally available to any user.
- the remote emoticon datastore 118 may store a user preference that causes an embodiment to automatically insert the first emoticon suggested to the user by the embodiment, or to automatically insert the suggested emoticon having the highest usage in a given context.
- the tablet computing device 104 , the smartphone computing device 108 , and the desktop computing device 112 may each be coupled to a separate, local emoticon datastore capable of storing user-customized emoticons, a user's favorite or preferred emoticons, associations between emoticons stored on the local emoticon datastore and contexts (e.g., subtext or meaning), user preferences with respect to identifying and suggesting emoticons, user-related information, or recipient-related information.
- the tablet computing device 104 may be coupled to the local emoticon datastore 102
- the smartphone computing device 108 may be coupled to the local emoticon datastore 106
- the desktop computing device 112 may be coupled to the local emoticon datastore 114 .
- each of the local emoticon datastores 102 , 106 , and 114 may be utilized by their respective computing device to locally cache previously suggested emoticons or suggested emoticons previously selected by a user. In doing so, some embodiments can repeatedly suggest the same emoticons for commonly occurring contexts while limiting the number of times the emoticon suggestion server 116 is queried for the suggested emoticons.
- the emoticons cached in the local emoticon datastores 102 , 106 , and 114 may have an expiration time, after which the cached emoticons are invalidated or purged. Once an emoticon item in the cache has expired, some embodiments resume querying the emoticon suggestion server 116 for suggested emoticons.
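The caching behavior described above can be sketched as a context-keyed cache with per-entry expiration, where the suggestion server is queried again only after a cached entry expires. The TTL value and the cache shape are assumptions for illustration.

```python
import time

CACHE_TTL_SECONDS = 3600          # assumed expiration time for cached suggestions
_cache = {}                       # context -> (expires_at, suggestions)

def get_suggestions(context, query_server):
    """Return suggestions for a context, consulting the cache before the server."""
    entry = _cache.get(context)
    now = time.time()
    if entry is not None and entry[0] > now:
        return entry[1]           # cache hit: entry has not expired yet
    suggestions = query_server(context)
    _cache[context] = (now + CACHE_TTL_SECONDS, suggestions)
    return suggestions

# A fake server stub records how often it is actually queried.
calls = []
def fake_server(context):
    calls.append(context)
    return [":-)"]

get_suggestions("happy", fake_server)
get_suggestions("happy", fake_server)  # second call is served from the cache
print(len(calls))  # 1
```

Invalidation versus purging of expired entries is left open here, matching the paragraph above.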
- FIG. 2 is a block diagram of an exemplary emoticon suggestion system 200 in accordance with some embodiments.
- the emoticon suggestion system 200 may comprise a display module 202 , an input module 204 , a segment analysis module 206 , an emoticon search module 208 , an emoticon suggestion module 210 , an emoticon selection module 212 , and an emoticon datastore 214 .
- the emoticon suggestion system 200 may further comprise memory and at least one processor, which facilitate operation of various modules contained in the emoticon suggestion system 200 .
- the display module 202 may display an input field, such as a text field or a text box, into which a user can input one or more segments, character emoticons, or graphical emoticons using the input module 204 .
- As segments and emoticons are entered into the input field, they appear in the input field.
- the display module 202 may display an input cursor, which indicates where a user's character inputs will be next entered or where an emoticon may be next entered.
- various embodiments may suggest emoticons based on the current position of the input cursor within the input field, the present segment content of the input field, user-related information, recipient-related information, user preferences, or some combination thereof.
- the candidate emoticons may be suggested to the user via the display module 202 .
- the display module 202 may, for the user's selection, display the candidate emoticons at or near the current position of the input cursor in the input field.
- the display module 202 may display the candidate emoticons at or near the input field via a callout box.
- the display module 202 may form part of a digital device (e.g., video display, or video projector) that may be responsible for displaying all graphical output from the digital device.
- the display module 202 may display the input field as part of a graphical user interface (GUI).
- the input field may be a graphical component of an application operating on a digital device (e.g., e-mail client, or an instant messaging application), or may be a graphical representation of a document viewable or editable through an application operating on the digital device (e.g., a text field of a web page shown through a web browser, or a document shown through a word processor).
- the input field may vary in type and size from embodiment to embodiment.
- the input module 204 may receive character input from a user and enter such character input into the input field as received. As character input is entered into the input field, the display module 202 may update the input field with the character input. Additionally, the input module 204 may further receive entry selections for emoticons suggested, in accordance with various embodiments. Generally, upon selection, the selected emoticons may be inserted at the current position of the input cursor in the input field.
- the input module may comprise a physical input device that is externally coupled to a digital device or that is physically embedded into the digital device. Examples of physical input devices can include, without limitation, keyboards, trackpads, or computer mice.
- the input module may comprise a virtual input device, such as a laser-projected keyboard or an on-screen keyboard, which may be provided (i.e., displayed) to the user through the display module 202 .
- virtual input devices may be displayed at or near the input field to which segments will be inputted.
- suggested emoticons may be presented to the user through the input module 204 .
- where the input module 204 comprises a physical keyboard, the keyboard may be configured to display suggested emoticons.
- the physical keyboard may display suggested emoticons by way of keys or buttons that comprise embedded displays (e.g., LCD buttons), or by way of a display embedded on a surface of the physical keyboard (e.g., at the top of the keyboard).
- the suggested emoticons may be displayed through the physical keyboard in color or in grayscale. As the suggested emoticons are displayed through the physical keyboard, the user may select one or more of those suggested emoticons through keys or buttons of the physical keyboard.
- the appearance of the on-screen keyboard may be reconfigured to display the suggested emoticons through the on-screen keyboard.
- the appearance of the on-screen keyboard may be reconfigured so that certain buttons of the on-screen keyboard are replaced with suggested emoticons buttons, or so that the on-screen keyboard is augmented with additional suggested emoticon buttons.
- the suggested emoticon buttons may be used by a user to select from the one or more suggested emoticons.
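- As a hedged illustration of the reconfiguration options above, the following sketch models an on-screen keyboard layout as rows of keys; the row structure and function name are illustrative assumptions, not part of any embodiment:

```python
def reconfigure_keyboard(rows, suggestions, augment=True):
    """Sketch of reconfiguring an on-screen keyboard: either augment
    the layout with an extra row of suggested-emoticon buttons, or
    replace the last row's keys with the suggestion buttons."""
    if augment:
        return rows + [suggestions]
    return rows[:-1] + [suggestions]
```

Either resulting layout lets the user select a suggested emoticon with a single button press.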
- the segment analysis module 206 may analyze one or more segments present in the input field and determine a context for the segments analyzed. As described herein, the context determined by the segment analysis module 206 may be subsequently utilized when identifying candidate emoticons to be suggested to the user. In various embodiments, the segment analysis module 206 may analyze only segments of interest from the input field when determining the context of segments in the input field.
- the segment analysis module 206 first identifies segments of interest in the input field, and then analyzes those segments of interest to determine a context. Generally, the segments of interest are identified in relation to a current position of an input cursor in the input field. Additionally for some embodiments, the segment analysis module 206 may perform syntactical analysis of the segments currently present in the input field when identifying segments of interest.
- the segment analysis module 206 may identify the segments of interest based on conditional or non-conditional rules that guide the segment of interest identification process.
- An exemplary rule for identifying segments of interest may include identifying the sentence fragment or sentence immediately preceding the current position of the input cursor in the input field as a segment of interest.
- Another exemplary rule for identifying segments of interest may include identifying the sentence fragment or sentence immediately following the current position of the input cursor in the input field as a segment of interest.
- the rules may be utilized in conjunction with the syntactical analysis performed by the segment analysis module 206 to determine the segments of interest.
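- The exemplary rules above can be sketched as follows; the regular-expression split is only a stand-in for the richer syntactical analysis performed by the segment analysis module 206, and the function name is hypothetical:

```python
import re

def segments_of_interest(text, cursor):
    """Identify segments of interest relative to the input cursor:
    the sentence fragment immediately preceding the cursor and the
    fragment immediately following it, per the exemplary rules."""
    before, after = text[:cursor], text[cursor:]
    # Sentence-ending punctuation approximates fragment boundaries.
    preceding = re.split(r"[.!?]", before)[-1].strip()
    following = re.split(r"[.!?]", after)[0].strip()
    return [s for s in (preceding, following) if s]
```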
- the segment analysis module 206 may analyze the context of each of the segments of interest, or may analyze the context of all but the least important segments of interest (e.g., based on a weight system, where certain segments of interest are of higher importance than others). In addition, one or more rules may determine which of the segments of interests should be analyzed when two or more segments of interest are identified.
- the segment analysis module 206 may determine two or more contexts from the segments of interest. In such cases, the emoticon suggestion system 200 may search for candidate emoticons associated with all of the determined contexts, or may only search for candidate emoticons that match one or more of the most important contexts (e.g., determined based on rules).
- the segment analysis module 206 may semantically analyze the segments of interest present in the input field. Those of skill in the art will appreciate that the semantic analysis of segments may be performed in accordance with one or more techniques known in the art.
- the segment analysis module 206 may determine a subtext or a meaning for the segments of interest. Based on the subtext or meaning identified for the segments of interest, the emoticon suggestion system 200 may identify one or more candidate emoticons for suggestion.
- the subtext of a segment of interest may identify a mood or an emotion for that segment of interest.
- Example subtexts for segments of interest may include, without limitation, happiness, sadness, indifference, anger, resentment, contrition, or excitement.
- the meaning for segments of interest may identify an explicit meaning for those segments. For example, where a segment of interest recites "I just got a new job!," the segment analysis module 206 may identify the meaning for the segment of interest as "new job."
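- A minimal sketch of determining subtext and meaning, assuming a toy keyword lexicon in place of true semantic analysis (the lexicon, its cues, and the function name are illustrative only):

```python
# Hypothetical keyword lexicon; a deployed analyzer would rely on
# real semantic analysis rather than this toy mapping.
SUBTEXT_LEXICON = {
    "happiness": ("fun", "great", "awesome", "glad"),
    "sadness": ("sad", "sorry", "miss"),
    "excitement": ("new job", "can't wait"),
}

def determine_context(segment):
    """Return (subtext, meaning) for a segment of interest: subtext
    names a mood or emotion; meaning is an explicit phrase."""
    lowered = segment.lower()
    subtext = next((mood for mood, cues in SUBTEXT_LEXICON.items()
                    if any(cue in lowered for cue in cues)), None)
    # As in the example above: "I just got a new job!" -> "new job"
    meaning = "new job" if "new job" in lowered else None
    return subtext, meaning
```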
- the segment analysis module 206 may identify and analyze segments of interest at or near real-time as the user adds characters or emoticons to, or removes characters or emoticons from, the input field using the input module 204.
- the emoticon search module 208 may search for one or more candidate emoticons based on an identified context (e.g., subtext or meaning) of a segment of interest.
- the emoticon search module 208 may search the emoticon datastore 214 for emoticons associated with the one or more contexts identified by the emoticon suggestion system 200 .
- the emoticon datastore 214 may comprise emoticons available for entry into the input field, and associations between an emoticon and one or more contexts.
- the association between a given emoticon and a given context may comprise statistical usage of the given emoticon with that given context.
- the strength of the association between the given emoticon and the given context may be based on such statistical usage.
- the statistical usage may be based on the user's own usage of the given emoticon with the given context, or may be based on usage of the given emoticon with the given context by a community of users (e.g., usage of a given emoticon in a given context on a social networking website).
- the greater the usage of a given emoticon with a given context, the stronger the association between that given emoticon and that given context.
- the strength of the association between an emoticon and a context may indicate the confidence in suggesting the emoticon for that context.
- the strength of the association may also be used to prioritize and present the one or more candidate emoticons from the highest strength to the lowest strength.
- the search for one or more candidate emoticons by the emoticon search module 208 may consider the strength of the association between the emoticon and the context. For example, the emoticon search module 208 may only identify an emoticon as a candidate emoticon if the strength of the association between the emoticon and the target context meets or exceeds a predetermined threshold. Additionally, the emoticon search module 208 may only identify an emoticon as a candidate emoticon when the strength of the association between the emoticon and the target context meets or exceeds a threshold relative to other potential candidate emoticons.
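- A hedged sketch of threshold-based candidate identification and strength-ordered ranking; the datastore layout (context mapped to per-emoticon usage counts), the normalization, and the threshold value are all assumptions:

```python
def find_candidates(datastore, context, threshold=0.2, limit=5):
    """Identify candidate emoticons for a context. Association
    strength is usage normalized over total usage for the context;
    only emoticons meeting the threshold become candidates, returned
    highest strength first."""
    usage = datastore.get(context, {})
    total = sum(usage.values()) or 1
    strengths = {emo: count / total for emo, count in usage.items()}
    ranked = sorted(((s, emo) for emo, s in strengths.items() if s >= threshold),
                    reverse=True)
    return [emo for _, emo in ranked[:limit]]
```

With usage counts {":-)": 70, ":-D": 25, ":-P": 5} for a context, only the first two emoticons clear the threshold and are suggested in strength order.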
- the emoticon search module 208 may further narrow the search for candidate emoticons by utilizing user preferences, user-related information, recipient-related information, or some combination thereof.
- user preferences may include, without limitation, a preference to suggest user-customized emoticons before other types of emoticons, and a preference to ignore certain categories of emoticons (e.g., suggest only emoticons that are age appropriate with respect to the user).
- Examples of user-related information may include, without limitation, a user interest, a user ethnicity, a user religion, a user geographic location, a user age, a user relational status, and a user occupation.
- Examples of recipient-related information may include, without limitation, a recipient's relation to the user, a recipient interest, a recipient ethnicity, a recipient religion, a recipient geographic location, a recipient age, a recipient relational status, and a recipient occupation.
- the emoticon search module 208 may further consider the input field's limitations in receiving character or graphical emoticons and search for candidate emoticons accordingly.
- the emoticon suggestion module 210 may receive the one or more candidate emoticons located based on an identified context of a segment of interest, and present the one or more candidate emoticons to the user for selection. As noted herein, in some embodiments, the emoticon suggestion module 210 may use the display module 202 to display for entry selection the one or more candidate emoticons at or near the current position of the input cursor in the input field. As also noted herein, in various embodiments, the emoticon suggestion module 210 may use the input module 204 to display for entry selection the one or more candidate emoticons through a physical input device or a virtual input device.
- the emoticon selection module 212 may receive from the user an entry selection for one or more candidate emoticons suggested to the user.
- the emoticon selection module 212 may receive the entry selection for the one or more candidate emoticons through the input module 204 , and the emoticon selection module 212 may enter the one or more selected emoticons into the input field.
- the emoticon selection module 212 may enter the one or more selected emoticons at the current position of the input cursor.
- the emoticon selection module 212 may enter the one or more selected emoticons into the input field by replacing segments or segments of interest within the input field with the one or more selected emoticons.
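- The two entry behaviors described above (insertion at the cursor, or replacement of a segment of interest) can be sketched as follows, with the function signature assumed for illustration:

```python
def insert_emoticon(text, cursor, emoticon, replace_segment=None):
    """Enter a selected emoticon at the current cursor position, or,
    when replace_segment is given, replace that segment of interest
    with the emoticon. Returns the new text and cursor position."""
    if replace_segment and replace_segment in text:
        start = text.index(replace_segment)
        new_text = text[:start] + emoticon + text[start + len(replace_segment):]
        return new_text, start + len(emoticon)
    return text[:cursor] + emoticon + text[cursor:], cursor + len(emoticon)
```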
- some embodiments may enable the user to set the emoticon selection module 212 (e.g., using a user preference) such that the module 212 auto-selects suggested emoticons based on certain guidelines. For instance, the user may configure the emoticon selection module 212 such that the first suggested emoticon is selected when an emoticon suggestion is made.
- the emoticon selection module 212 may update the statistical usage information based on the entry selection received from the user. In particular, the emoticon selection module 212 may receive the entry selection of one or more candidate emoticons for a given context, and update the statistical usage information stored between the selected candidate emoticons and their respective contexts of usage. Depending on the embodiment, the emoticon selection module 212 may update the statistical usage information on the emoticon datastore 214 .
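- A minimal stand-in for updating statistical usage upon entry selection; the class shape and count-based representation of usage are assumptions:

```python
class EmoticonDatastore:
    """Minimal stand-in for the emoticon datastore 214: per-context
    usage counts back the association strengths."""

    def __init__(self):
        self.usage = {}  # context -> {emoticon: count}

    def record_selection(self, context, emoticon):
        # Each entry selection strengthens the association between
        # the selected emoticon and its context of usage.
        counts = self.usage.setdefault(context, {})
        counts[emoticon] = counts.get(emoticon, 0) + 1
        return counts[emoticon]
```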
- the emoticon datastore 214 may comprise a library of emoticons available for suggestion by the emoticon suggestion system 200 , and associations between emoticons in the library and contexts (e.g., subtexts and meanings).
- the emoticon search module 208 may access the emoticon datastore 214 when searching for one or more candidate emoticons that are associated with one or more particular contexts.
- the emoticon datastore 214 may comprise two or more associations between a given emoticon and a given context (e.g., subtext or meaning).
- the association between a given emoticon and a given context may comprise statistical usage of the given emoticon with the given context. Such statistical usage may reflect the strength of the association between the emoticon and the context.
- Emoticons stored on the emoticon datastore 214 may include character emoticons, graphical emoticons, graphically animated emoticons, and emoticons accompanied by sound.
- the emoticon datastore 214 may further comprise user preferences, user information, or recipient information, which may be utilized by the embodiments when identifying emoticons suitable for suggestion.
- the emoticon datastore 214 may store a user preference that causes an embodiment to suggest user-defined or user-uploaded emoticons before suggesting emoticons generally available to any user.
- the emoticon datastore 214 may store a user preference that causes an embodiment to automatically insert the first emoticon suggested to the user by the embodiment, or to automatically insert the suggested emoticon having the highest usage in a given context.
- a variety of emoticon libraries and a variety of associations between emoticons and contexts may be stored on the emoticon datastore 214.
- a “module” may comprise software, hardware, firmware, and/or circuitry.
- one or more software programs comprising instructions capable of being executable by a processor may perform one or more of the functions of the modules described herein.
- circuitry may perform the same or similar functions.
- Alternative embodiments may comprise more, less, or functionally equivalent modules and still be within the scope of present embodiments.
- the functions of the various modules may be combined or divided differently.
- the functions of various modules may be distributed amongst one or more modules residing at an emoticon suggestion server and one or more modules residing at an emoticon suggestion client.
- FIG. 3 is a flow chart of an exemplary method 300 for identifying and suggesting an emoticon in accordance with some embodiments.
- the segment analysis module 206 may receive one or more segments from an input field, which may be displayed through the display module 202 . As noted herein, upon receiving the one or more segments, the segment analysis module 206 may identify segments of interest for context analysis purposes.
- the segment analysis module 206 may analyze the one or more segments to determine one or more target subtexts or one or more target meanings of the segments.
- the target subtexts and the target meanings of the segments provide for one or more contexts associated with the segments.
- the segment analysis module 206 may analyze only those segments which have been identified as segments of interest by the segment analysis module 206 .
- the emoticon search module 208 may identify one or more candidate emoticons having an association with the one or more target subtexts or one or more target meanings, which may have been determined by the segment analysis module 206.
- the emoticon search module 208 may identify one or more candidate emoticons in the emoticon datastore 214 which have an association with the target subtexts or the target meanings.
- the strength of each association may be based on statistical usage of a given emoticon with a given context, and that strength may be taken into consideration as the emoticon search module 208 identifies one or more candidate emoticons.
- the emoticon suggestion module 210 may present the one or more candidate emoticons to a user for entry selection at a current position of an input cursor in an input field. As described herein, the input field and the input cursor therein may be displayed to the user through the display module 202 .
- the emoticon suggestion module 210 may present the one or more candidate emoticons to the user for entry selection using the display module 202, and may display the candidate emoticons at or near the current position of the input cursor in the input field.
- the emoticon suggestion module 210 may present the one or more candidate emoticons to the user for entry selection through one or more input devices of the input module 204 .
- the emoticon suggestion module 210 may present the one or more candidate emoticons to the user through a physical input device, such as a physical keyboard having a display, or through a virtual input device, such as an on-screen keyboard.
- the emoticon selection module 212 may receive an entry selection from the user for one or more select emoticons from the one or more candidate emoticons. For some embodiments, the emoticon selection module 212 may receive the entry selection from the input module 204. Additionally, upon receiving the entry selection, the emoticon selection module 212 may update the statistical usage information on the emoticon datastore 214 for the one or more candidate emoticons based on the entry selection, thereby strengthening or weakening the association between the candidate emoticons and particular contexts.
- the emoticon selection module 212 may insert the one or more selected emoticons into the input field at the current position of the input cursor.
- entry of the candidate emoticons into the input field may involve replacing one or more segments in the input field with the selected emoticons.
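- Tying the steps of method 300 together, a hedged end-to-end sketch (the toy context analysis, the datastore layout, and the `choose` callback standing in for the user's entry selection are all assumptions):

```python
def suggest_and_insert(text, cursor, datastore, choose=lambda cands: cands[0]):
    """End-to-end sketch of method 300: receive segments, determine
    a context, identify candidates, present them for entry selection,
    and insert the selection at the cursor."""
    segment = text[:cursor].rsplit(".", 1)[-1].strip()        # segment of interest
    context = "happiness" if "fun" in segment else "neutral"  # toy target subtext
    by_usage = datastore.get(context, {})
    candidates = sorted(by_usage, key=by_usage.get, reverse=True)
    if not candidates:
        return text                                           # nothing to suggest
    selected = choose(candidates)                             # user's entry selection
    return text[:cursor] + " " + selected                     # insert at cursor
```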
- FIG. 4 is a block diagram of an exemplary emoticon suggesting system 400 using a client-server architecture in accordance with some embodiments.
- the emoticon suggesting system 400 may comprise an emoticon suggestion client 402 and an emoticon suggestion server 420 .
- the emoticon suggestion client 402 may be similar to the digital device described in FIG. 7 , or to the computing devices described in FIG. 1 (i.e., tablet computing device 104 , the smartphone computing device 108 , and the desktop computing device 112 ), and the emoticon suggestion server 420 may be similar to the digital device described in FIG. 7 , or to the emoticon suggestion server 116 described in FIG. 1 .
- the emoticon suggestion client 402 and the emoticon suggestion server 420 may communicate with one another over a communication network 418 .
- the emoticon suggestion client 402 may comprise a display module 404 , an input module 406 , a segment analysis module 408 , an emoticon search module 410 , a local emoticon datastore 412 , an emoticon suggestion module 414 , and an emoticon selection module 416 .
- the emoticon suggestion server 420 may comprise an emoticon search engine 422 , and a remote emoticon datastore 424 .
- the display module 404 may display an input field into which a user can input one or more segments, character emoticons, or graphical emoticons using the input module 406. Typically, as segments and emoticons are entered into the input field, they appear in the input field. Within the input field, the display module 404 may display an input cursor, where the input cursor indicates where a user's character inputs will be next entered or where an emoticon may be next entered.
- Various embodiments may suggest emoticons based on a number of factors including, for example, the current position of the input cursor within the input field, the present segment content of the input field, user-related information, recipient-related information, user preferences, or some combination thereof.
- the candidate emoticons, once identified, may be suggested to the user via the display module 404.
- the display module 404 may, for the user's selection, display the candidate emoticons at or near the current position of the input cursor in the input field.
- the display module 404 may display the candidate emoticons at or near the input field via a callout box.
- the display module 404 may form part of a digital device that may be responsible for displaying all graphical output from the digital device.
- the display module 404 may display the input field as part of a graphical user interface (GUI).
- the input field may be a graphical component of an application operating on a digital device, or may be a graphical representation of a document viewable or editable through an application operating on the digital device. It will be appreciated by those of ordinary skill in the art that the input field may vary in type and size from embodiment to embodiment.
- the input module 406 may receive character input from a user and enter such character input into the input field as received. As character input is entered into the input field, the display module 404 may update the input field with the character input. Additionally, the input module 406 may further receive entry selections for emoticons suggested in accordance with various embodiments. Generally, upon selection, the selected emoticons may be inserted at the current position of the input cursor in the input field.
- the input module may comprise a physical input device that is externally coupled to a digital device or that is physically embedded into the digital device, or a virtual input device, such as an on-screen keyboard, which may be provided to the user through the display module 404. In various embodiments, as virtual input devices are employed, such virtual input devices may be displayed at or near the input field to which segments will be inputted.
- suggested emoticons may be presented to the user through the input module 406 .
- where the input module 406 comprises a physical keyboard, the keyboard may be configured to display suggested emoticons.
- the physical keyboard may display suggested emoticons by way of keys or buttons that comprise embedded displays (e.g., LCD buttons), or by way of a display embedded on a surface of the physical keyboard (e.g., at the top of the keyboard).
- the suggested emoticons may be displayed through the physical keyboard in color or in grayscale. As the suggested emoticons are displayed through the physical keyboard, the user may select one or more of those suggested emoticons through keys or buttons of the physical keyboard.
- the appearance of the on-screen keyboard may be reconfigured to display the suggested emoticons through the on-screen keyboard.
- the appearance of the on-screen keyboard may be reconfigured so that certain buttons of the on-screen keyboard are replaced with suggested emoticons buttons, or so that the on-screen keyboard is augmented with additional suggested emoticon buttons.
- the suggested emoticon buttons may be used by a user to select from the one or more suggested emoticons.
- the segment analysis module 408 may analyze one or more segments present in the input field and determine a context for the segments analyzed. As described herein, the context determined by the segment analysis module 408 may be subsequently utilized when identifying candidate emoticons to be suggested to the user. In various embodiments, the segment analysis module 408 may first identify segments of interest in the input field and then only analyze those segments of interest when determining the context of segments in the input field.
- the segment analysis module 408 may perform syntactical analysis of the segments currently present in the input field when identifying segments of interest. Additionally, the segment analysis module 408 may identify the segments of interest based on conditional or non-conditional rules that guide the segment of interest identification process.
- the segment analysis module 408 may semantically analyze the segments of interest present in the input field. When analyzing the context of one or more segments of interest, the segment analysis module 408 may determine a subtext or a meaning of the segments of interest. The subtext of a segment of the interest may identify a mood or an emotion for that segment of interest. Based on the subtext or meaning identified for the segments of interest, the emoticon suggestion system 400 may identify one or more candidate emoticons for suggestion.
- the segment analysis module 408 may identify and analyze segments of interest at or near real-time as the user adds characters or emoticons to, or removes characters or emoticons from, the input field using the input module 406.
- the emoticon search module 410 may search for one or more candidate emoticons based on an identified context (e.g., subtext or meaning) of a segment of interest. In some embodiments, the emoticon search module 410 may access the local emoticon datastore 412 when searching for one or more candidate emoticons that are associated with one or more particular contexts.
- the local emoticon datastore 412 may store user-customized emoticons, a user's favorite or preferred emoticons, associations between emoticons stored on the local emoticon datastore 412 and contexts (e.g., subtext or meaning), user preferences with respect to identifying and suggesting emoticons, user-related information, or recipient-related information. Additionally, the local emoticon datastore 412 may be utilized to locally cache previously suggested emoticons or suggested emoticons previously selected by the user.
- the emoticon search module 410 may utilize the emoticon suggestion server 420 to search for and provide candidate emoticons to the emoticon suggestion client 402 .
- the emoticon suggestion server 420 may search for candidate emoticons on the remote emoticon datastore 424 and provide resulting candidate emoticons to the emoticon search module 410 on the emoticon suggestion client 402 .
- the emoticon suggestion server 420 may use the emoticon search engine 422 to search for candidate emoticons on the remote emoticon datastore 424 , to retrieve candidate emoticons from the remote emoticon datastore 424 , and to provide the candidate emoticons to the emoticon search module 410 .
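- One plausible client-side arrangement, sketched under the assumption that the local emoticon datastore is consulted before the round trip to the emoticon suggestion server (the document describes both sources but not this exact ordering):

```python
def search_candidates(context, local_store, remote_search):
    """Client-side search: consult the local emoticon datastore first
    (it may cache previously suggested emoticons), then fall back to
    the emoticon suggestion server's search engine. remote_search
    stands in for the network round trip."""
    local_hits = local_store.get(context)
    if local_hits:
        return list(local_hits), "local"
    return remote_search(context), "remote"
```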
- the remote emoticon datastore 424 may comprise a library of emoticons available for suggestion to the emoticon suggestion client 402 .
- the remote emoticon datastore 424 may further comprise associations between emoticons in the library and contexts.
- each association may comprise statistical usage of a given emoticon in the library with a given context. Generally, such statistical usage may reflect the strength of the association between the emoticon and the context.
- emoticons stored on the remote emoticon datastore 424 may include character emoticons, graphical emoticons, graphically animated emoticons, and emoticons accompanied by sound.
- the remote emoticon datastore 424 may further comprise user preferences, user information, or recipient information, which may be utilized by the embodiments when identifying emoticons suitable for suggestion.
- Those skilled in the art will appreciate that a variety of emoticon libraries and a variety of associations between emoticons and contexts can be stored on the remote emoticon datastore 424.
- the emoticon suggestion module 414 may receive the one or more candidate emoticons located based on an identified context of a segment of interest, and present the one or more candidate emoticons to the user for selection. As noted herein, in some embodiments, the emoticon suggestion module 414 may use the display module 404 to display for entry selection the one or more candidate emoticons at or near the current position of the input cursor in the input field. As also noted herein, in various embodiments, the emoticon suggestion module 414 may use the input module 406 to display for entry selection the one or more candidate emoticons through a physical input device or a virtual input device.
- the emoticon selection module 416 may receive from the user an entry selection for one or more candidate emoticons suggested to the user.
- the emoticon selection module 416 may receive the entry selection for the one or more candidate emoticons through the input module 406, and the emoticon selection module 416 may enter the one or more selected emoticons into the input field.
- the emoticon selection module 416 may enter the one or more selected emoticons at the current position of the input cursor. Additionally, the emoticon selection module 416 may enter the one or more selected emoticons into the input field by replacing segments or segments of interest within the input field with the one or more selected emoticons.
- Some embodiments may enable the user to set the emoticon selection module 416 (e.g., using a user preference) such that the module 416 auto-selects suggested emoticons based on certain guidelines. For instance, the user may configure the emoticon selection module 416 such that the first suggested emoticon is selected when an emoticon suggestion is made.
- the emoticon selection module 416 may update the statistical usage information based on the entry selection received from the user.
- the emoticon selection module 416 may receive the entry selection of one or more candidate emoticons for a given context, and update the statistical usage information stored between the selected candidate emoticons and their respective contexts of usage.
- the emoticon selection module 416 may update the statistical usage information on the local emoticon datastore 412 or on the remote emoticon datastore 424. For example, if the one or more candidate emoticons selected through the emoticon selection module 416 were provided from the emoticon suggestion server 420, the statistical usage information for those candidate emoticons will be updated on the remote emoticon datastore 424. In another example, if the one or more candidate emoticons selected through the emoticon selection module 416 were provided from the local emoticon datastore 412, the statistical usage information for those candidate emoticons will be updated on the local emoticon datastore 412.
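- The provenance-based update routing described above can be sketched as follows; the store layout and the `provenance` flag are assumptions for illustration:

```python
def update_usage(selection, provenance, local_store, remote_store):
    """Route the statistical-usage update to whichever datastore the
    selected candidate was provided from. Each store maps
    context -> {emoticon: count}."""
    emoticon, context = selection
    target = remote_store if provenance == "remote" else local_store
    counts = target.setdefault(context, {})
    counts[emoticon] = counts.get(emoticon, 0) + 1
    return target
```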
- FIG. 5 depicts a user-interface 500 of a messaging application, where the messaging application utilizes an embodiment.
- a user may utilize the user interface 500 to receive and review messages received from other users over online chat, and to compose and transmit messages to other users over online chat.
- the messaging application may be a client of an instant messaging system, where the messaging application is operating on a digital device local to the user, such as a smartphone computing device or a laptop.
- the instant messaging system may operate on another digital device such as a server, where the messaging application interfaces with the instant messaging system.
- the messaging application may operate on a digital device as a standalone application, or as an applet, plug-in, or script operating through a web browser.
- the user interface 500 of the messaging application may comprise a conversation pane 502 , a message input field 504 , and a send button 514 .
- the conversation pane 502 may comprise messages submitted to the online chat.
- the conversation pane 502 may include messages submitted to the online chat from others, and messages submitted by the user through the user interface 500 .
- the user may submit messages to the online chat using the message input field 504 .
- the user may enter a message into the message input field 504 and press the send button 514 when the user desires to submit the message to the online chat.
- the message input field 504 may comprise and may be configured to receive a message prepared by the user for submission to the online chat.
- the message input field 504 may receive one or more segments from the user, or may receive one or more emoticons entered in accordance with some embodiments.
- the message input field 504 may further comprise an input cursor 516 .
- various embodiments may suggest emoticons for entry at the current position of the input cursor 516 .
- the embodiment may suggest a “smiley face” graphical emoticon 510 for entry into the input field 504 based on the embodiment's analysis of the segment of interest 512 , which recites “so much fun.”
- the embodiment may suggest the “smiley face” graphical emoticon 510 based on an association between the “smiley face” graphical emoticon 510 and the context of the segment of interest 512 .
- the embodiment may enter the “smiley face” graphical emoticon 510 into the message input field 504 .
- the embodiment may suggest a plurality of graphical emoticons 506 based on the context analysis of the segment of interest 514 .
- the embodiment may present the suggested, graphical emoticons 506 by displaying the graphical emoticons 506 in a callout box 508 positioned at or near the current position of the input cursor 516 .
- the embodiment may suggest the graphical emoticons 506 , which relate to cities.
- FIG. 6 depicts a user interface 600 of a messaging application, where the messaging application utilizes an embodiment.
- a user may utilize the user interface 600 to receive and review messages received from other users over online chat, and to compose and transmit messages to other users over online chat
- the messaging application may be a client on an instant messaging system, where the messaging application operates on a digital device local to the user, such as a smartphone computing device or a laptop.
- the instant messaging system may operate on another digital device such as a server, where the messaging application interfaces with the instant messaging system.
- the messaging application may operate on a digital device as a standalone application, or as an applet, plug-in, or script operating through a web browser.
- the user interface 600 of the messaging application may comprise a conversation pane 602 , a message input field 604 , an on-screen keyboard 606 , and a send button 616 .
- the conversation pane 602 may comprise messages submitted to the online chat, including messages submitted by the user through the user interface 600 .
- the user may submit messages to the online chat using the message input field 604 .
- the user may enter a message into the message input field 604 using the on-screen keyboard 606 , and may press the send button 616 when the user desires to submit the message to the online chat.
- the message input field 604 may comprise and may be configured to receive a message prepared by the user for submission to the online chat.
- the message input field 604 may receive one or more segments from the user through the on-screen keyboard 606 , or may receive one or more emoticons as selected through the on-screen keyboard 606 .
- the message input field 604 may further comprise an input cursor 610 .
- the on-screen keyboard 606 may comprise a QWERTY keyboard, a button 624 to hide the on-screen keyboard 606 from view (e.g., when not in use), and an emoticon menu 622 .
- the emoticon menu 622 may comprise emoticons from a default emoticon library, or a selection of emoticons suggested by the embodiment.
- a left select button 618 and a right select button 620 may allow the user to scroll and browse through the emoticons available for entry selection through the emoticon menu 622 .
- various embodiments may suggest emoticons for entry at the current position of the input cursor 610 .
- the embodiment may suggest a “football” graphical emoticon 614 for entry into the input field 604 based on the embodiment's analysis of the segment of interest 612 , which recites “football.”
- the embodiment may suggest the “football” graphical emoticon 614 based on an association between the “football” graphical emoticon 614 and the context of the segment of interest 612 .
- the embodiment may enter the “football” graphical emoticon 614 into the message input field 604 .
- the embodiment may suggest a plurality of “field goal” graphical emoticons based on the context analysis of the segment of interest 608 .
- the embodiment may present the “field goal” graphical emoticons for entry selection by displaying the graphical emoticons in the emoticon menu 622 , which may be displayed as part of the on-screen keyboard 606 .
- FIG. 7 is a block diagram of an exemplary digital device 700 .
- the digital device 700 comprises a processor 702 , a memory system 704 , a storage system 706 , a communication network interface 708 , an I/O interface 710 , and a display interface 712 communicatively coupled to a bus 714 .
- the processor 702 is configured to execute executable instructions (e.g., programs).
- the processor 702 comprises circuitry or any processor capable of processing the executable instructions.
- the memory system 704 is any memory configured to store data. Some examples of the memory system 704 are storage devices, such as RAM or ROM. The memory system 704 can comprise a RAM cache. In various embodiments, data is stored within the memory system 704. The data within the memory system 704 may be cleared or ultimately transferred to the storage system 706.
- the storage system 706 is any non-transitory storage configured to retrieve and store data. Some examples of the storage system 706 are flash drives, hard drives, optical drives, and/or magnetic tape.
- the digital device 700 includes a memory system 704 in the form of RAM and a storage system 706 in the form of flash memory. Both the memory system 704 and the storage system 706 comprise computer readable media which may store instructions or programs that are executable by a computer processor including the processor 702.
- the communication network interface (com. network interface) 708 can be coupled to a network (e.g., communication network 110 ) via the link 716 .
- the communication network interface 708 may support communication over an Ethernet connection, a serial connection, a parallel connection, or an ATA connection, for example.
- the communication network interface 708 may also support wireless communication (e.g., 802.11 a/b/g/n, WiMax). It will be apparent to those skilled in the art that the communication network interface 708 can support many wired and wireless standards.
- the optional input/output (I/O) interface 710 is any device that receives input from the user and outputs data.
- the optional display interface 712 is any device that is configured to output graphics and data to a display. In one example, the display interface 712 is a graphics adapter. It will be appreciated that not all digital devices 700 comprise either the I/O interface 710 or the display interface 712 .
- a digital device 700 may comprise more or less hardware elements than those depicted. Further, hardware elements may share functionality and still be within various embodiments described herein.
- encoding and/or decoding may be performed by the processor 702 and/or a co-processor located on a GPU (i.e., Nvidia).
- the above-described functions and components can be comprised of instructions that are stored on a storage medium such as a computer readable medium.
- the instructions can be retrieved and executed by a processor.
- Some examples of instructions are software, program code, and firmware.
- Some examples of storage medium are memory devices, tape, disks, integrated circuits, and servers.
- the instructions are operational when executed by the processor to direct the processor to operate in accordance with some embodiments. Those skilled in the art are familiar with instructions, processor(s), and storage medium.
Abstract
Description
- The invention(s) described herein generally relate to emoticons. More particularly, the invention(s) relate to systems and methods for identifying and suggesting emoticons during various activities on a computing device.
- Originally, emoticons were facial expressions represented by characters (e.g., ASCII characters) commonly found on computer keyboards, such as letters, numbers, and symbols. These original emoticons, once placed in an electronic message or an electronic posting by an author (e.g., on an electronic bulletin board), were meant to convey the author's mood or to convey/enhance the overall sentiment of the message or the posting. In the beginning, these emoticons were limited to expressing moods, such as happiness, anger, sadness, and indifference. Gradually, however, the use of these character-based emoticons (hereafter, “character emoticons”) expanded to conveying meanings and messages.
- Eventually, emoticons expanded further in type, availability, and usage. Today, emoticons include character emoticons and emoticons represented by graphical images (hereafter, “graphical emoticons”). With the availability of graphical emoticons, a user can depict a greater number of moods, meanings, and messages once not possible with character emoticons alone. Both character and graphical emoticons are now available for use through a variety of digital devices (e.g., mobile telecommunication devices and tablets), and are used in a variety of computing activities, especially with respect to the Internet. For example, graphical emoticons are commonly available for use when drafting personal e-mails, when posting messages on the Internet (e.g., on a social networking site or a web forum), and when messaging between mobile devices. Generally, as a user performs a computing activity applicable to emoticons, the user may access emoticons through a menu or library from which they can browse and select emoticons for use in the computing activity.
- Unfortunately, with the emergence of graphical emoticons, the number of emoticons a user can choose from has grown vastly. There are graphical emoticons available for almost every subject matter imaginable. Due to the expansion in number, usage, availability, and variety of emoticons, it can be quite time consuming, and sometimes overwhelming, for users to browse through and select appropriate emoticons for a given context when participating in emoticon-applicable computing activities.
- Various embodiments discussed herein provide systems and methods for identifying and suggesting emoticons for segments of texts. Some systems and methods may be utilized during a user activity on a computing device including, without limitation, instant messaging, participating in online chat rooms, drafting e-mails, posting web blogs, or posting to web forums.
- An exemplary method comprises receiving a set of segments from a text field, analyzing the set of segments to determine at least one of a target subtext or a target meaning associated with the set of segments, and identifying a set of candidate emoticons where each candidate emoticon in the set of candidate emoticons has an association between the candidate emoticon and at least one of the target subtext or the target meaning. The method may further comprise presenting the set of candidate emoticons for entry selection at a current position of an input cursor, receiving an entry selection for a set of selected emoticons from the set of candidate emoticons, and inserting the set of selected emoticons into the text field at the current position of the input cursor. The set of segments may comprise one or more segments of interest selected relative to a current position of an input cursor in the text field, the set of candidate emoticons may comprise one or more candidate emoticons, and the set of selected emoticons may comprise one or more selected emoticons. Depending on the embodiment, analyzing the set of segments may comprise semantic analysis of the set of segments.
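The exemplary method above can be pictured as a short pipeline. The sketch below is purely illustrative and is not the disclosed implementation: the function names are invented, and a simple keyword table stands in for the semantic analysis used to determine a target subtext or meaning.

```python
# Illustrative sketch of the exemplary method: analyze segments for a target
# subtext/meaning, identify associated candidate emoticons, and insert a
# selected emoticon at the input cursor. The keyword table is a hypothetical
# stand-in for semantic analysis.
CONTEXT_KEYWORDS = {
    "happy": {"fun", "great", "glad"},
    "sad": {"sad", "sorry", "unfortunately"},
}

EMOTICONS_BY_CONTEXT = {
    "happy": [":)", ":-D"],
    "sad": [":(", ":'("],
}

def analyze_segments(segments):
    """Determine a target subtext or meaning for a set of segments."""
    words = {w.strip(".,!?").lower() for s in segments for w in s.split()}
    for context, keywords in CONTEXT_KEYWORDS.items():
        if words & keywords:
            return context
    return None

def identify_candidates(target):
    """Identify candidate emoticons associated with the target."""
    return EMOTICONS_BY_CONTEXT.get(target, [])

def insert_at_cursor(text, cursor, emoticon):
    """Insert a selected emoticon into the text field at the cursor."""
    return text[:cursor] + emoticon + text[cursor:]
```

For instance, under these assumed tables, a segment such as "so much fun" would map to a happy context and yield the happy-context candidates.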
- For some embodiments, each association may comprise a statistical usage of the candidate emoticon with at least one of the target subtext or the target meaning. Additionally, for some embodiments, the method may further comprise updating the statistical usage of the candidate emoticons based on the entry selection for the set of selected emoticons. Depending on the embodiment, the statistical usage may be based on usage by a single user or by a plurality of users.
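One way to realize the statistical-usage association described above is a simple counter keyed by emoticon and context; the table below is a hypothetical sketch, not the patented mechanism, and could equally track a single user's selections or aggregated usage by a plurality of users.

```python
from collections import defaultdict

# Hypothetical statistical-usage table: how often each candidate emoticon
# has been selected with a given target subtext or meaning.
usage = defaultdict(int)

def record_selection(emoticon, target):
    """Update statistical usage after an entry selection."""
    usage[(emoticon, target)] += 1

def rank_candidates(candidates, target):
    """Order candidates by their recorded usage with the target context."""
    return sorted(candidates, key=lambda e: usage[(e, target)], reverse=True)
```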
- Presenting the set of candidate emoticons for entry selection may involve displaying the candidate emoticons, for entry selection, at or near the current position of the input cursor. Presenting the set of candidate emoticons for entry selection may also comprise displaying the set of candidate emoticons, for entry selection, on a physical input device or a virtual input device (e.g., an on-screen keyboard or a projected keyboard), wherein the physical input device and the virtual input device are configured to execute the entry selection. Depending on the embodiment, the virtual input device may be displayed by a display device that is also displaying the text field. Additionally, the virtual input device may be displayed in close proximity to the text field.
- In some embodiments, the method may further comprise identifying the set of segments using syntactical analysis. Each segment of interest may comprise at least one of a word, a sentence fragment, a sentence, a phrase, or a passage that precedes or follows a current position of an input cursor.
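As a rough illustration of selecting segments of interest relative to the input cursor, the helper below takes the few words that precede the cursor; the word-based window and the function name are assumptions made for this sketch, not the syntactical analysis actually claimed.

```python
import re

def segments_of_interest(text, cursor, n_words=3):
    """Return up to n_words words immediately preceding the input cursor
    (a hypothetical stand-in for syntactical analysis)."""
    before = text[:cursor]
    return re.findall(r"[\w']+", before)[-n_words:]
```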
- In particular embodiments, identifying the set of candidate emoticons may be further based on at least one of a user preference, user-related information, or recipient-related information. The user-related information may include a user interest, a user ethnicity, a user religion, a user geographic location, a user age, a user relational status, and a user occupation. The recipient-related information may include a recipient's relation to a user, a recipient interest, a recipient ethnicity, a recipient religion, a recipient geographic location, a recipient age, a recipient relational status, and a recipient occupation.
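A minimal sketch of refining candidates with user- and recipient-related information might look like the following; the profile fields (`favorites`, `blocked`) are invented for illustration and are not drawn from the disclosure.

```python
def filter_by_profile(candidates, user_prefs=None, recipient=None):
    """Hypothetical refinement: drop emoticons unsuitable for the recipient
    and move the user's favorite emoticons to the front."""
    user_prefs = user_prefs or {}
    recipient = recipient or {}
    blocked = set(recipient.get("blocked", ()))
    favorites = set(user_prefs.get("favorites", ()))
    kept = [e for e in candidates if e not in blocked]
    # Stable sort: favorites (key False) sort before non-favorites (key True).
    return sorted(kept, key=lambda e: e not in favorites)
```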
- An exemplary system comprises a processor, a display module, an input module, a segment analysis module, an emoticon search module, an emoticon suggestion module, and an emoticon selection module. The display module may be configured to display a text field and one or more segments entered into the text field.
- The input module may be configured to receive segment input from a user and to enter the segment input into the text field at an input cursor. The segment analysis module may be configured to receive a set of segments from the text field, wherein the set of segments comprises one or more segments of interest selected relative to a current position of the input cursor in the text field. The segment analysis module may be further configured to use the processor to analyze the set of segments to determine at least one of a target subtext or a target meaning associated with the set of segments. The emoticon search module may be configured to identify a set of candidate emoticons, wherein each candidate emoticon in the set of candidate emoticons has an association between the candidate emoticon and at least one of the target subtext or the target meaning, and wherein the set of candidate emoticons comprises one or more candidate emoticons. The emoticon suggestion module may be configured to present the set of candidate emoticons through the display module for entry selection at the current position of the input cursor. The emoticon selection module may be configured to receive from the input module an entry selection for a set of selected emoticons from the set of candidate emoticons, wherein the set of selected emoticons comprises one or more selected emoticons. The emoticon selection module may be further configured to insert the set of selected emoticons into the text field at the current position of the input cursor.
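The division of labor among the modules described above can be pictured with a small wiring sketch; the class and method names are assumptions made for illustration, and the keyword check is a crude stand-in for the segment analysis module's context determination.

```python
# Illustrative wiring of the exemplary system's modules. Class and method
# names are assumptions made for this sketch, not the patent's implementation.
class SegmentAnalysisModule:
    def analyze(self, segments):
        # Hypothetical stand-in for determining a target subtext or meaning.
        return "happy" if any("fun" in s for s in segments) else None

class EmoticonSearchModule:
    def __init__(self, datastore):
        self.datastore = datastore  # maps context -> candidate emoticons

    def identify(self, target):
        return self.datastore.get(target, [])

class EmoticonSuggestionSystem:
    def __init__(self, datastore):
        self.analysis = SegmentAnalysisModule()
        self.search = EmoticonSearchModule(datastore)

    def suggest(self, segments):
        target = self.analysis.analyze(segments)
        return self.search.identify(target)
```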
- In some embodiments, the system further comprises an emoticon datastore comprising one or more emoticons capable of entry into the text field, and wherein the emoticon search module is further configured to identify a set of candidate emoticons on the emoticon datastore.
- In various embodiments, each association may comprise a statistical usage of the candidate emoticon with at least one of the target subtext or the target meaning, and the emoticon selection module may be further configured to update the statistical usage of the candidate emoticons based on the entry selection for the set of selected emoticons.
- In some embodiments, presenting the set of emoticons through the display module for entry selection may comprise displaying the emoticon, for entry selection, at or near the current position of the input cursor. The input module may comprise a physical input device or a virtual input device, wherein the physical input device and the virtual input interface are configured to execute the entry selection.
- Other features and aspects of some embodiments will become apparent from the following detailed description, taken in conjunction with the accompanying drawings, which illustrate, by way of example, the features in accordance with various embodiments.
- Various embodiments are described in detail with reference to the following figures. The drawings are provided for purposes of illustration only and merely depict some example embodiments. These drawings are provided to facilitate the reader's understanding of the various embodiments and shall not be considered limiting of the breadth, scope, or applicability of embodiments.
- FIG. 1 depicts an example of an environment in which various embodiments may be utilized.
- FIG. 2 is a block diagram of an exemplary emoticon suggestion system in accordance with some embodiments.
- FIG. 3 is a flow chart of an exemplary method for identifying and suggesting an emoticon in accordance with some embodiments.
- FIG. 4 is a block diagram of an exemplary emoticon suggesting system using a client-server architecture in accordance with some embodiments.
- FIG. 5 depicts a user interface of a messaging application, where the messaging application utilizes an embodiment.
- FIG. 6 depicts a user interface of a messaging application, where the messaging application utilizes an embodiment.
- FIG. 7 is a block diagram of an exemplary digital device.
- A number of embodiments described herein relate to systems and methods that identify and suggest emoticons during a variety of activities on a computing device involving typing characters into a text field. Various systems and methods may identify the emoticon by analyzing a context of segments present in the text field and identifying one or more candidate emoticons available for entry into the text field based on that context. Subsequently, the user may select one or more emoticons from the candidate emoticons, and the selected emoticons may be entered into the text field. Optionally, the user could choose to ignore the emoticon suggestion(s) entirely and continue with their activities on the computing device. As used in this description, a “segment” may comprise one or more characters that represent a word, a phrase, a sentence fragment, a sentence, or a passage.
- Depending on the embodiment, analysis of the context of segments present in the text field may involve determining a subtext or a meaning relating to those segments, which may require semantic analysis of those segments. Also, as described herein, the association between a particular candidate emoticon and a particular subtext or meaning may be based on (past) statistical usage of the particular candidate emoticon with the particular subtext or meaning. In various embodiments, such emoticon usage may be based on a user's personal usage of the particular emoticon with the particular subtext or meaning (e.g., user's selection of suggested emoticons in the particular subtext or meaning), or may be based on a community's usage of the particular emoticon with the particular subtext or meaning (e.g., observed usage of certain emoticons in postings on a social network by a community of users).
- FIG. 1 depicts an example of an environment 100 in which various embodiments may be utilized. The environment 100 comprises a tablet computing device 104, a local emoticon datastore 102 coupled to the tablet computing device 104, a smartphone computing device 108, a local emoticon datastore 106 coupled to the smartphone computing device 108, a desktop computing device 112, a local emoticon datastore 114 coupled to the desktop computing device 112, an emoticon suggestion server 116, and a remote emoticon datastore 118 coupled to the emoticon suggestion server 116. The environment 100 further comprises a communication network 110 over which the tablet computing device 104, the smartphone computing device 108, the desktop computing device 112, and the emoticon suggestion server 116 communicate. The tablet computing device 104, the smartphone computing device 108, the desktop computing device 112, and the emoticon suggestion server 116 are examples of digital devices having a processor and memory. Other exemplary digital devices with which various embodiments may be utilized include laptops, netbooks, notebooks, media devices, music devices, personal digital assistants (PDAs), or the like. Exemplary digital devices are further described in FIG. 7. - In accordance with some embodiments, the tablet computing device 104, the
smartphone computing device 108, and the desktop computing device 112 may be exemplary digital devices that utilize systems and methods for identifying and suggesting emoticons for entry. For instance, such computing devices may utilize certain embodiments to identify and suggest emoticons when a user is using an instant messaging application on such computing devices, or when the user is posting a message on a website forum through such computing devices. Those of ordinary skill in the art will appreciate that other digital devices could be utilized in conjunction with various embodiments described herein. - In some embodiments, the
emoticon suggestion server 116 may facilitate the identification and suggestion of an emoticon for a user at a digital device. As later described herein, the emoticon suggestion server 116 may determine the context of a segment, may identify one or more candidate emoticons based on a determined context, may suggest one or more candidate emoticons to a digital device, or may perform some combination thereof. For various embodiments, the emoticon suggestion server 116 may be a service operating on a server that hosts an Internet service, where the emoticon suggestion server 116 provides emoticon suggestion functionality to the Internet service. For instance, the emoticon suggestion server 116 may be a service operating on a web server that is hosting a website (e.g., a website forum or a social networking website) that is being serviced by the emoticon suggestion server 116 (i.e., that is being provided emoticon suggestions by the emoticon suggestion server 116). - Depending on the embodiment, various operations and components for identifying and suggesting an emoticon may be isolated to the digital device that utilizes the emoticon suggestions, or may be distributed on varying levels amongst two or more digital devices. For example, a system or method for identifying, suggesting, and entering an emoticon when drafting an e-mail on the
smartphone computing device 108 may be entirely embedded in an e-mail application that is stored and operated on the smartphone computing device 108. In an alternative example, while using the tablet computing device 104 to prepare a message post for a website forum, a system or method for identifying, suggesting, and entering an emoticon may utilize the tablet computing device 104 to determine the context of the message as currently prepared, utilize the emoticon suggestion server 116 to identify one or more candidate emoticons for use in the message as currently prepared, and then utilize the tablet computing device 104 to present the candidate emoticons as suggested emoticons. - The
emoticon suggestion server 116 may utilize the remote emoticon datastore 118 during the identification and suggestion of emoticons to digital devices. For certain embodiments, the remote emoticon datastore 118 may comprise a library of emoticons available for suggestion by the emoticon suggestion server 116, and associations between emoticons in the library and contexts (e.g., subtexts and meanings). For example, the remote emoticon datastore 118 may comprise a library of “happy face” emoticons, and associations between the “happy face” emoticons and a happy context. In another example, the remote emoticon datastore 118 may comprise a library of “San Francisco” emoticons, and associations between the “San Francisco” emoticons and contexts that explicitly or implicitly refer to the city of San Francisco. For some embodiments, the remote emoticon datastore 118 may comprise two or more associations between a given emoticon and a given context (e.g., subtext or meaning). For example, the remote emoticon datastore 118 may comprise a library of “frowning face” emoticons, associations between the “frowning face” emoticons and a sad context, and associations between the “frowning face” emoticons and a displeased context. Those skilled in the art would appreciate that a variety of emoticon libraries and a variety of associations between emoticons and contexts can be stored on the remote emoticon datastore 118. - Depending on the embodiment, the library of emoticons may comprise emoticons that are accessible by any user or accessible by a limited group of users with restricted access (e.g., based on a premium, or only accessible to certain groups), user-customized or user-uploaded emoticons, or emoticons that are user favorites. In addition to character and graphical emoticons that convey a mood or emotion from an author, emoticons used in various embodiments may include those that relate to interests, hobbies, geographic locations, events, holidays, seasons, weather, and the like.
Emoticons stored on the emoticon suggestion datastore 118 may include character emoticons, graphical emoticons, graphically animated emoticons, and emoticons accompanied by sound. For some embodiments, the remote emoticon datastore 118 may further comprise user preferences, user information or recipient information, which may be utilized by the embodiments when identifying emoticons suitable for suggestion. For example, the remote emoticon datastore 118 may store a user preference that causes an embodiment to suggest user-defined or user-uploaded emoticons before suggesting emoticons generally available to any user.
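The association structure described for the remote emoticon datastore 118 can be sketched as a mapping from contexts to emoticons, with one emoticon (such as a "frowning face") reachable from more than one context; the dictionary layout and the emoticon names below are illustrative assumptions only.

```python
# Hypothetical shape of the datastore's context-to-emoticon associations.
# A single emoticon may be associated with two or more contexts.
EMOTICON_ASSOCIATIONS = {
    "sad": ["frowning_face"],
    "displeased": ["frowning_face"],
    "happy": ["happy_face"],
    "san_francisco": ["golden_gate_bridge", "cable_car"],
}

def emoticons_for_context(context):
    """Return the emoticons associated with a given subtext or meaning."""
    return EMOTICON_ASSOCIATIONS.get(context, [])
```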
- In some embodiments, the tablet computing device 104, the
smartphone computing device 108, and the desktop computing device 112 may each be coupled to a separate, local emoticon datastore capable of storing user-customized emoticons, a user's favorite or preferred emoticons, associations between emoticons stored on the local emoticon datastore and contexts (e.g., subtext or meaning), user preferences with respect to identifying and suggesting emoticons, user-related information, or recipient-related information. For instance, the tablet computing device 104 may be coupled to the local emoticon datastore 102, the smartphone computing device 108 may be coupled to the local emoticon datastore 106, and the desktop computing device 112 may be coupled to the local emoticon datastore 114. - Additionally, each of the
local emoticon datastores 102, 106, and 114 may cache suggested emoticons, and the cache may be consulted before the emoticon suggestions server 116 is queried for the suggested emoticons. For some embodiments, the emoticons cached in the local emoticon datastores may be suggested to the user before querying the emoticon suggestion server 116 for suggested emoticons. -
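The local-cache-before-server behavior described above can be sketched as follows; the function signature is invented for illustration, with the server query passed in as a callable rather than a real network request.

```python
def suggest_with_cache(context, local_cache, query_server):
    """Consult the local emoticon cache first; query the emoticon suggestion
    server only on a cache miss, then cache its answer (illustrative only)."""
    if context in local_cache:
        return local_cache[context]
    emoticons = query_server(context)
    local_cache[context] = emoticons
    return emoticons
```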
FIG. 2 is a block diagram of an exemplary emoticon suggestion system 200 in accordance with some embodiments. The emoticon suggestion system 200 may comprise a display module 202, an input module 204, a segment analysis module 206, an emoticon search module 208, an emoticon suggestion module 210, an emoticon selection module 212, and an emoticon datastore 214. In some embodiments, the emoticon suggestion system 200 may further comprise memory and at least one processor, which facilitate operation of various modules contained in the emoticon suggestion system 200. - The
display module 202 may display an input field, such as a text field or a text box, into which a user can input one or more segments, character emoticons, or graphical emoticons using the input module 204. Typically, as segments and emoticons are entered into the input field they appear in the input field. As noted herein, a “segment” may comprise one or more characters that represent a word, a phrase, a sentence fragment, a sentence, or a passage. As part of the input field, the display module 202 may display an input cursor, which indicates where a user's character inputs will be next entered or where an emoticon may be next entered. - As noted herein, various embodiments may suggest emoticons based on the current position of the input cursor within the input field, the present segment content of the input, user-related information, recipient-related information, user preferences, or some combination thereof. Generally, once one or more candidate emoticons have been identified for suggestion to the user (e.g., based on the segment content of the input field), the candidate emoticons may be suggested to the user via the
display module 202. Specifically, the display module 202 may, for the user's selection, display the candidate emoticons at or near the current position of the input cursor in the input field. Depending on the embodiment, the display module 202 may display the candidate emoticons at or near the input field via a callout box. - For some embodiments, the
display module 202 may form part of a digital device (e.g., video display, or video projector) that may be responsible for displaying all graphical output from the digital device. In some embodiments, the display module 202 may display the input field as part of a graphical user interface (GUI). For instance, the input field may be a graphical component of an application operating on a digital device (e.g., e-mail client, or an instant messaging application), or may be a graphical representation of a document viewable or editable through an application operating on the digital device (e.g., a text field of a web page shown through a web browser, or a document shown through a word processor). Those of ordinary skill in the art will appreciate that the input field may vary in type and size from embodiment to embodiment. - The
input module 204 may receive character input from a user and enter such character input into the input field as received. As character input is entered into the input field, the display module 202 may update the input field with the character input. Additionally, the input module 204 may further receive entry selections for emoticons suggested, in accordance with various embodiments. Generally, upon selection, the selected emoticons may be inserted at the current position of the input cursor in the input field. Depending on the embodiment, the input module may comprise a physical input device that is externally coupled to a digital device or that is physically embedded into the digital device. Examples of physical input devices can include, without limitation, keyboards, trackpads, or computer mice. In some embodiments, the input module may comprise a virtual input device, such as a laser-projected keyboard or an on-screen keyboard, which may be provided (i.e., displayed) to the user through the display module 202. In various embodiments, as virtual input devices are employed, such virtual input devices may be displayed at or near the input field to which segments will be inputted. - As further described in
FIG. 6, in some embodiments, suggested emoticons may be presented to the user through the input module 204. For example, where the input module 204 comprises a physical keyboard, the physical keyboard may be configured to display suggested emoticons through the physical keyboard. For some embodiments, the physical keyboard may display suggested emoticons by way of keys or buttons that comprise embedded displays (e.g., LCD buttons), or by way of a display embedded on a surface of the physical keyboard (e.g., at the top of the keyboard). Depending on the embodiment, the suggested emoticons may be displayed through the physical keyboard in color or in grayscale. As the suggested emoticons are displayed through the physical keyboard, the user may select one or more of those suggested emoticons through keys or buttons of the physical keyboard. - In another example, where the
input module 204 comprises an on-screen keyboard (like those found on some tablet computing devices and smartphone computing devices), the appearance of the on-screen keyboard may be reconfigured to display the suggested emoticons through the on-screen keyboard. For some embodiments, the appearance of the on-screen keyboard may be reconfigured so that certain buttons of the on-screen keyboard are replaced with suggested emoticon buttons, or so that the on-screen keyboard is augmented with additional suggested emoticon buttons. Once presented through the on-screen keyboard, the suggested emoticon buttons may be used by a user to select from the one or more suggested emoticons. - The
segment analysis module 206 may analyze one or more segments present in the input field and determine a context for the segments analyzed. As described herein, the context determined by the segment analysis module 206 may be subsequently utilized when identifying candidate emoticons to be suggested to the user. In various embodiments, the segment analysis module 206 may analyze only segments of interest from the input field when determining the context of segments in the input field. - In some embodiments, the
segment analysis module 206 first identifies segments of interest in the input field, and then analyzes those segments of interest to determine a context. Generally, the segments of interest are identified in relation to a current position of an input cursor in the input field. Additionally, for some embodiments, the segment analysis module 206 may perform syntactical analysis of the segments currently present in the input field when identifying segments of interest. - Depending on the embodiment, the
segment analysis module 206 may identify the segments of interest based on conditional or non-conditional rules that guide the segment-of-interest identification process. An exemplary rule for identifying segments of interest may include identifying the sentence fragment or sentence immediately preceding the current position of the input cursor in the input field as a segment of interest. Another exemplary rule for identifying segments of interest may include identifying the sentence fragment or sentence immediately following the current position of the input cursor in the input field as a segment of interest. For some embodiments, the rules may be utilized in conjunction with the syntactical analysis performed by the segment analysis module 206 to determine the segments of interest. - Where more than one segment of interest is identified, the
segment analysis module 206 may analyze the context of each of the segments of interest, or may analyze the context of all but the least important segments of interest (e.g., based on a weight system, where certain segments of interest are of higher importance than others). In addition, one or more rules may determine which of the segments of interest should be analyzed when two or more segments of interest are identified. - The
segment analysis module 206 may determine two or more contexts from the segments of interest. In such cases, the emoticon suggestion system 200 may search for candidate emoticons associated with all of the determined contexts, or may only search for candidate emoticons that match one or more of the most important contexts (e.g., determined based on rules). - To determine a context of one or more segments of interest, the
segment analysis module 206 may semantically analyze the segments of interest present in the input field. Those of skill in the art will appreciate that the semantic analysis of segments may be performed in accordance with one or more techniques known in the art. When analyzing the context of one or more segments of interest, the segment analysis module 206 may determine a subtext or a meaning for the segments of interest. Based on the subtext or meaning identified for the segments of interest, the emoticon suggestion system 200 may identify one or more candidate emoticons for suggestion. The subtext of a segment of interest may identify a mood or an emotion for that segment of interest. Example subtexts for segments of interest may include, without limitation, happiness, sadness, indifference, anger, resentment, contrition, or excitement. The meaning for a segment of interest may identify what that segment of interest explicitly conveys. For example, where a segment of interest recites "I just got a new job!," the segment analysis module 206 may identify the meaning for the segment of interest as "new job." - It should be noted that for some embodiments, the
segment analysis module 206 may identify and analyze segments of interest at or near real-time as the user adds characters or emoticons to, or removes characters or emoticons from, the input field using the input module 204. - The
emoticon search module 208 may search for one or more candidate emoticons based on an identified context (e.g., subtext or meaning) of a segment of interest. In some embodiments, the emoticon search module 208 may search the emoticon datastore 214 for emoticons associated with the one or more contexts identified by the emoticon suggestion system 200. As described herein, the emoticon datastore 214 may comprise emoticons available for entry into the input field, and associations between an emoticon and one or more contexts. - As noted herein, the association between a given emoticon and a given context may comprise statistical usage of the given emoticon with that given context. The strength of the association between the given emoticon and the given context may be based on such statistical usage. Additionally, the statistical usage may be based on the user's own usage of the given emoticon with the given context, or may be based on usage of the given emoticon with the given context by a community of users (e.g., usage of a given emoticon in a given context on a social networking website).
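As a concrete illustration of the segment analysis described above, the sketch below identifies the sentence fragment immediately preceding the input cursor and derives a subtext from it. This is a minimal sketch only: keyword matching stands in for the genuine semantic analysis contemplated by the disclosure, and all names (`segment_of_interest`, `determine_subtext`, `SUBTEXT_KEYWORDS`) are illustrative rather than taken from the specification.

```python
import re

# Toy subtext lexicon; a real segment analysis module would apply
# semantic analysis rather than simple keyword lookup.
SUBTEXT_KEYWORDS = {
    "happiness": {"fun", "happy", "glad", "great"},
    "sadness": {"sad", "sorry", "miss"},
    "excitement": {"new", "amazing", "wow"},
}

def segment_of_interest(text, cursor_pos):
    """Return the sentence fragment immediately preceding the cursor."""
    before = text[:cursor_pos]
    # The last piece after sentence-ending punctuation is the fragment.
    return re.split(r"[.!?]+", before)[-1].strip()

def determine_subtext(segment):
    """Map a segment of interest to a mood/emotion subtext, if any."""
    words = set(re.findall(r"[a-z']+", segment.lower()))
    for subtext, keywords in SUBTEXT_KEYWORDS.items():
        if words & keywords:
            return subtext
    return None
```

For instance, with the cursor at the end of "I got a new job! It was so much fun", the fragment identified would be "It was so much fun", whose subtext maps to happiness under the toy lexicon.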
- Generally, the more usage of a given emoticon with a given context, the stronger the association between that given emoticon and that given context. For some embodiments, the strength of the association between an emoticon and a context may indicate the confidence in suggesting the emoticon for that context. The strength of the association may also be used to prioritize and present the one or more candidate emoticons from the highest strength to the lowest strength.
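A minimal sketch of this strength-based prioritization, assuming associations are kept as per-context strengths between 0 and 1 derived from statistical usage; the cutoff value and function name are assumptions for illustration only.

```python
def rank_candidates(associations, context, min_strength=0.2, limit=5):
    """Return emoticons associated with `context`, ordered from the
    strongest association to the weakest, dropping any candidate whose
    strength falls below `min_strength`."""
    strengths = associations.get(context, {})
    # Sort by association strength, highest first.
    ranked = sorted(strengths.items(), key=lambda kv: kv[1], reverse=True)
    return [emo for emo, s in ranked if s >= min_strength][:limit]
```

Given `{"happiness": {":)": 0.9, ":-D": 0.5, ":|": 0.05}}`, the weakly associated ":|" would be filtered out and ":)" presented first.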
- In some embodiments, the search for one or more candidate emoticons by the emoticon
search module 208 may consider the strength of the association between the emoticon and the context. For example, the emoticon search module 208 may only identify an emoticon as a candidate emoticon if the strength of the association between the emoticon and the target context meets or exceeds a predetermined threshold. Additionally, the emoticon search module 208 may only identify an emoticon as a candidate emoticon when the strength of the association between the emoticon and the target context meets or exceeds a threshold relative to other potential candidate emoticons. - As noted herein, in various embodiments, the
emoticon search module 208 may further narrow the search for candidate emoticons by utilizing user preferences, user-related information, recipient-related information, or some combination thereof. Examples of user preferences may include, without limitation, a preference to suggest user-customized emoticons before other types of emoticons, and a preference to ignore certain categories of emoticons (e.g., suggest only emoticons that are age appropriate with respect to the user). Examples of user-related information may include, without limitation, a user interest, a user ethnicity, a user religion, a user geographic location, a user age, a user relational status, and a user occupation. Examples of recipient-related information may include, without limitation, a recipient's relation to a user, a recipient interest, a recipient ethnicity, a recipient religion, a recipient geographic location, a recipient age, a recipient relational status, and a recipient occupation. In certain embodiments, when searching for one or more candidate emoticons, the emoticon search module 208 may further consider the input field's limitations in receiving character or graphical emoticons and search for candidate emoticons accordingly. - The emoticon suggestion module 210 may receive the one or more candidate emoticons located based on an identified context of a segment of interest, and present the one or more candidate emoticons to the user for selection. As noted herein, in some embodiments, the emoticon suggestion module 210 may use the
display module 202 to display for entry selection the one or more candidate emoticons at or near the current position of the input cursor in the input field. As also noted herein, in various embodiments, the emoticon suggestion module 210 may use the input module 204 to display for entry selection the one or more candidate emoticons through a physical input device or a virtual input device. - The emoticon selection module 212 may receive from the user an entry selection for one or more candidate emoticons suggested to the user. In particular embodiments, the emoticon selection module 212 may receive the entry selection for the one or more candidate emoticons through the
input module 204, and the emoticon selection module 212 may enter the one or more selected emoticons into the input field. As noted herein, the emoticon selection module 212 may enter the one or more selected emoticons at the current position of the input cursor. For some embodiments, the emoticon selection module 212 may enter the one or more selected emoticons into the input field by replacing segments or segments of interest within the input field with the one or more selected emoticons. Additionally, some embodiments may enable the user to set the emoticon selection module 212 (e.g., using a user preference) such that the module 212 auto-selects suggested emoticons based on certain guidelines. For instance, the user may configure the emoticon selection module 212 such that the first suggested emoticon is selected when an emoticon suggestion is made. - In some embodiments, where associations between emoticons and contexts comprise statistical usage of such emoticons with such contexts, the emoticon selection module 212 may update the statistical usage information based on the entry selection received from the user. In particular, the emoticon selection module 212 may receive the entry selection of one or more candidate emoticons for a given context, and update the statistical usage information stored between the selected candidate emoticons and their respective contexts of usage. Depending on the embodiment, the emoticon selection module 212 may update the statistical usage information on the
emoticon datastore 214. - The emoticon datastore 214 may comprise a library of emoticons available for suggestion by the
emoticon suggestion system 200, and associations between emoticons in the library and contexts (e.g., subtexts and meanings). The emoticon search module 208 may access the emoticon datastore 214 when searching for one or more candidate emoticons that are associated with one or more particular contexts. As noted herein, for some embodiments, the emoticon datastore 214 may comprise two or more associations between a given emoticon and a given context (e.g., subtext or meaning). Additionally, the association between a given emoticon and a given context may comprise statistical usage of the given emoticon with the given context. Such statistical usage may reflect the strength of the association between the emoticon and the context. - Emoticons stored on the emoticon datastore 214 may include character emoticons, graphical emoticons, graphically animated emoticons, and emoticons accompanied by sound. For some embodiments, the emoticon datastore 214 may further comprise user preferences, user information, or recipient information, which may be utilized by embodiments when identifying emoticons suitable for suggestion. For example, the emoticon datastore 214 may store a user preference that causes an embodiment to suggest user-defined or user-uploaded emoticons before suggesting emoticons generally available to any user. In another example, the emoticon datastore 214 may store a user preference that causes an embodiment to automatically insert the first emoticon suggested to the user by the embodiment, or to automatically insert the suggested emoticon having the highest usage in a given context.
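Treating the datastore's associations as per-context usage counts, the statistical-usage update performed when a user selects a suggestion might be sketched as follows. A plain dict-of-dicts stands in for the emoticon datastore 214, and both function names are illustrative assumptions.

```python
def record_selection(usage_counts, context, emoticon):
    """Increment the usage count linking `emoticon` to `context`;
    higher counts translate into stronger associations at search time."""
    by_context = usage_counts.setdefault(context, {})
    by_context[emoticon] = by_context.get(emoticon, 0) + 1
    return by_context[emoticon]

def association_strength(usage_counts, context, emoticon):
    """Normalize a raw count into a 0-1 strength relative to all usage
    recorded for the same context."""
    by_context = usage_counts.get(context, {})
    total = sum(by_context.values())
    return by_context.get(emoticon, 0) / total if total else 0.0
```

Selecting ":)" three times and ":-D" once for "happiness" would, under this normalization, give ":)" a strength of 0.75 for that context.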
- Those skilled in the art would appreciate that a variety of emoticon libraries and a variety of associations between emoticons and contexts may be stored on the
emoticon datastore 214. - It will be appreciated that a "module" may comprise software, hardware, firmware, and/or circuitry. In one example, one or more software programs comprising instructions executable by a processor may perform one or more of the functions of the modules described herein. In another example, circuitry may perform the same or similar functions. Alternative embodiments may comprise more, fewer, or functionally equivalent modules and still be within the scope of present embodiments. For example, the functions of the various modules may be combined or divided differently. For example, the functions of various modules may be distributed amongst one or more modules residing at an emoticon suggestion server and one or more modules residing at an emoticon suggestion client.
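The entry behavior described for the emoticon selection module 212 above, inserting a selected emoticon at the input cursor or replacing a segment of interest with it, reduces to simple string splicing. A sketch, with both function names being illustrative assumptions:

```python
def insert_at_cursor(field, cursor, emoticon):
    """Insert `emoticon` at the cursor position; return the updated
    field text and the new cursor position after the emoticon."""
    return field[:cursor] + emoticon + field[cursor:], cursor + len(emoticon)

def replace_segment(field, segment, emoticon):
    """Replace the first occurrence of `segment` with `emoticon`,
    mirroring the segment-replacement entry mode described above."""
    return field.replace(segment, emoticon, 1)
```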
-
FIG. 3 is a flow chart of an exemplary method 300 for identifying and suggesting an emoticon in accordance with some embodiments. In step 302, the segment analysis module 206 may receive one or more segments from an input field, which may be displayed through the display module 202. As noted herein, upon receiving the one or more segments, the segment analysis module 206 may identify segments of interest for context analysis purposes. - In
step 304, the segment analysis module 206 may analyze the one or more segments to determine one or more target subtexts or one or more target meanings of the segments. The target subtexts and the target meanings of the segments provide for one or more contexts associated with the segments. Depending on the embodiment, the segment analysis module 206 may analyze only those segments which have been identified as segments of interest by the segment analysis module 206. - In
step 306, the emoticon search module 208 may identify one or more candidate emoticons having an association with the one or more target subtexts or one or more target meanings, which may have been determined by the segment analysis module 206. In some embodiments, the emoticon search module 208 may identify one or more candidate emoticons in the emoticon datastore 214 which have an association with the target subtexts or the target meanings. As noted herein, the strength of each association may be based on statistical usage of a given emoticon with a given context, and such strength may be taken into consideration as the emoticon search module 208 identifies one or more candidate emoticons. - In
step 308, the emoticon suggestion module 210 may present the one or more candidate emoticons to a user for entry selection at a current position of an input cursor in an input field. As described herein, the input field and the input cursor therein may be displayed to the user through the display module 202. For some embodiments, the emoticon suggestion module 210 may present the one or more candidate emoticons to the user for entry selection using the display module 202, and may display the candidate emoticons at or near the current position of the input cursor in the input field. Additionally, the emoticon suggestion module 210 may present the one or more candidate emoticons to the user for entry selection through one or more input devices of the input module 204. For example, the emoticon suggestion module 210 may present the one or more candidate emoticons to the user through a physical input device, such as a physical keyboard having a display, or through a virtual input device, such as an on-screen keyboard. - In
step 310, the emoticon selection module 212 may receive an entry selection from the user for one or more select emoticons from the one or more candidate emoticons. For some embodiments, the emoticon selection module 212 may receive the entry selection from the input module 204. Additionally, upon receiving the entry selection, the emoticon selection module 212 may update the statistical usage information on the emoticon datastore 214 for the one or more candidate emoticons based on the entry selection, thereby strengthening or weakening the association between the candidate emoticons and particular contexts. - In
step 312, based on the entry selection received by the emoticon selection module 212, the emoticon selection module 212 may insert the one or more selected emoticons into the input field at the current position of the input cursor. As noted herein, in some embodiments, entry of the selected emoticons into the input field may involve replacing one or more segments in the input field with the selected emoticons. -
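Steps 302 through 312 above can be strung together in a single sketch. As before, keyword matching stands in for semantic analysis, the auto-select-first behavior is assumed rather than required, and the function name merely echoes the method 300 label.

```python
import re

def method_300(text, cursor, associations):
    """Receive segments (302), analyze the fragment before the cursor
    (304), find and rank candidates (306/308), auto-select the first
    (310), and insert it at the cursor (312)."""
    fragment = re.split(r"[.!?]+", text[:cursor])[-1].strip().lower()
    candidates = []
    for keyword, emoticons in associations.items():
        if keyword in fragment:
            # Rank this context's emoticons by usage count, highest first.
            candidates = sorted(emoticons, key=emoticons.get, reverse=True)
            break
    if not candidates:
        return text  # nothing to suggest; field unchanged
    chosen = candidates[0]
    return text[:cursor] + " " + chosen + text[cursor:]
```

With `{"fun": {":)": 5, ":-D": 2}}` as the association table, typing "It was so much fun" with the cursor at the end would yield "It was so much fun :)".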
FIG. 4 is a block diagram of an exemplary emoticon suggestion system 400 using a client-server architecture in accordance with some embodiments. The emoticon suggestion system 400 may comprise an emoticon suggestion client 402 and an emoticon suggestion server 420. In some embodiments, the emoticon suggestion client 402 may be similar to the digital device described in FIG. 7, or to the computing devices described in FIG. 1 (i.e., the tablet computing device 104, the smartphone computing device 108, and the desktop computing device 112), and the emoticon suggestion server 420 may be similar to the digital device described in FIG. 7, or to the emoticon suggestion server 116 described in FIG. 1. The emoticon suggestion client 402 and the emoticon suggestion server 420 may communicate with one another over a communication network 418. - The
emoticon suggestion client 402 may comprise a display module 404, an input module 406, a segment analysis module 408, an emoticon search module 410, a local emoticon datastore 412, an emoticon suggestion module 414, and an emoticon selection module 416. The emoticon suggestion server 420 may comprise an emoticon search engine 422 and a remote emoticon datastore 424. - At the
emoticon suggestion client 402, the display module 404 may display an input field into which a user can input one or more segments, character emoticons, or graphical emoticons using the input module 406. Typically, as segments and emoticons are entered into the input field, they appear in the input field. Within the input field, the display module 404 may display an input cursor, where the input cursor indicates where a user's character inputs will be next entered or where an emoticon may be next entered. - Various embodiments may suggest emoticons based on a number of factors including, for example, the current position of the input cursor within the input field, the present segment content of the input field, user-related information, recipient-related information, user preferences, or some combination thereof. The candidate emoticons, once identified, may be suggested to the user via the
display module 404. Specifically, the display module 404 may, for the user's selection, display the candidate emoticons at or near the current position of the input cursor in the input field. Depending on the embodiment, the display module 404 may display the candidate emoticons at or near the input field via a callout box. - Through the
display module 404, a digital device may display all graphical output from the digital device. In some embodiments, the display module 404 may display the input field as part of a graphical user interface (GUI). Depending on the embodiment, the input field may be a graphical component of an application operating on a digital device, or may be a graphical representation of a document viewable or editable through an application operating on the digital device. It will be appreciated by those of ordinary skill in the art that the input field may vary in type and size from embodiment to embodiment. - The
input module 406 may receive character input from a user and enter such character input into the input field as received. As character input is entered into the input field, the display module 404 may update the input field with the character input. Additionally, the input module 406 may further receive entry selections for emoticons suggested in accordance with various embodiments. Generally, upon selection, the selected emoticons may be inserted at the current position of the input cursor in the input field. As noted herein, the input module may comprise a physical input device that is externally coupled to a digital device or that is physically embedded into the digital device, or a virtual input device, such as an on-screen keyboard, which may be provided to the user through the display module 404. In various embodiments, as virtual input devices are employed, such virtual input devices may be displayed at or near the input field to which segments will be inputted. - For some embodiments, suggested emoticons may be presented to the user through the
input module 406. For example, where the input module 406 comprises a physical keyboard, the physical keyboard may be configured to display suggested emoticons through the physical keyboard. For some embodiments, the physical keyboard may display suggested emoticons by way of keys or buttons that comprise embedded displays (e.g., LCD buttons), or by way of a display embedded on a surface of the physical keyboard (e.g., at the top of the keyboard). The suggested emoticons may be displayed through the physical keyboard in color or in grayscale. As the suggested emoticons are displayed through the physical keyboard, the user may select one or more of those suggested emoticons through keys or buttons of the physical keyboard. - In some embodiments, where the
input module 406 comprises an on-screen keyboard, the appearance of the on-screen keyboard may be reconfigured to display the suggested emoticons through the on-screen keyboard. For example, the appearance of the on-screen keyboard may be reconfigured so that certain buttons of the on-screen keyboard are replaced with suggested emoticon buttons, or so that the on-screen keyboard is augmented with additional suggested emoticon buttons. Once presented through the on-screen keyboard, the suggested emoticon buttons may be used by a user to select from the one or more suggested emoticons. - The
segment analysis module 408 may analyze one or more segments present in the input field and determine a context for the segments analyzed. As described herein, the context determined by the segment analysis module 408 may be subsequently utilized when identifying candidate emoticons to be suggested to the user. In various embodiments, the segment analysis module 408 may first identify segments of interest in the input field and then only analyze those segments of interest when determining the context of segments in the input field. - In some embodiments, the
segment analysis module 408 may perform syntactical analysis of the segments currently present in the input field when identifying segments of interest. Additionally, the segment analysis module 408 may identify the segments of interest based on conditional or non-conditional rules that guide the segment-of-interest identification process. - To determine a context of one or more segments of interest, the
segment analysis module 408 may semantically analyze the segments of interest present in the input field. When analyzing the context of one or more segments of interest, the segment analysis module 408 may determine a subtext or a meaning of the segments of interest. The subtext of a segment of interest may identify a mood or an emotion for that segment of interest. Based on the subtext or meaning identified for the segments of interest, the emoticon suggestion system 400 may identify one or more candidate emoticons for suggestion. - It should be noted that for some embodiments, the
segment analysis module 408 may identify and analyze segments of interest at or near real-time as the user adds characters or emoticons to, or removes characters or emoticons from, the input field using the input module 406. - The
emoticon search module 410 may search for one or more candidate emoticons based on an identified context (e.g., subtext or meaning) of a segment of interest. In some embodiments, the emoticon search module 410 may access the local emoticon datastore 412 when searching for one or more candidate emoticons that are associated with one or more particular contexts. - Depending on the embodiment, the local emoticon datastore 412 may store user-customized emoticons, a user's favorite or preferred emoticons, associations between emoticons stored on the local emoticon datastore 412 and contexts (e.g., subtext or meaning), user preferences with respect to identifying and suggesting emoticons, user-related information, or recipient-related information. Additionally, the local emoticon datastore 412 may be utilized to locally cache previously suggested emoticons or suggested emoticons previously selected by the user.
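The local caching behavior described above for the local emoticon datastore 412 could be sketched as a small least-recently-used cache of suggestions per context. The class name, capacity, and eviction policy are assumptions for illustration; the disclosure does not prescribe a particular caching scheme.

```python
from collections import OrderedDict

class LocalSuggestionCache:
    """Cache recently suggested emoticons per context, evicting the
    least recently used context once capacity is reached."""
    def __init__(self, capacity=32):
        self.capacity = capacity
        self._store = OrderedDict()

    def put(self, context, emoticons):
        self._store[context] = list(emoticons)
        self._store.move_to_end(context)       # mark as most recent
        if len(self._store) > self.capacity:
            self._store.popitem(last=False)    # evict oldest context

    def get(self, context):
        if context not in self._store:
            return None
        self._store.move_to_end(context)       # refresh recency on hit
        return self._store[context]
```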
- In some embodiments, the
emoticon search module 410 may utilize the emoticon suggestion server 420 to search for and provide candidate emoticons to the emoticon suggestion client 402. For example, the emoticon suggestion server 420 may search for candidate emoticons on the remote emoticon datastore 424 and provide resulting candidate emoticons to the emoticon search module 410 on the emoticon suggestion client 402. The emoticon suggestion server 420 may use the emoticon search engine 422 to search for candidate emoticons on the remote emoticon datastore 424, to retrieve candidate emoticons from the remote emoticon datastore 424, and to provide the candidate emoticons to the emoticon search module 410. - The remote emoticon datastore 424 may comprise a library of emoticons available for suggestion to the
emoticon suggestion client 402. The remote emoticon datastore 424 may further comprise associations between emoticons in the library and contexts. For certain embodiments, the associations may comprise statistical usage of a given emoticon in the library with a given context. Generally, such statistical usage may reflect the strength of the association between the emoticon and the context. - As noted herein, emoticons stored on the remote emoticon datastore 424 may include character emoticons, graphical emoticons, graphically animated emoticons, and emoticons accompanied by sound. For some embodiments, the remote emoticon datastore 424 may further comprise user preferences, user information, or recipient information, which may be utilized by embodiments when identifying emoticons suitable for suggestion. Those skilled in the art would appreciate that a variety of emoticon libraries and a variety of associations between emoticons and contexts can be stored on the remote emoticon datastore 424.
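The division of labor between the emoticon search module 410 and the emoticon suggestion server 420 suggests a local-first lookup with a remote fallback. In this sketch the server is modeled as a plain callable (`fetch_remote`), which is an assumption for illustration rather than the disclosed interface, and the function name is likewise illustrative.

```python
def find_candidates(context, local_store, fetch_remote):
    """Try the local emoticon datastore first; on a miss, query the
    suggestion server and cache its answer locally."""
    candidates = local_store.get(context)
    if candidates is not None:
        return candidates
    candidates = fetch_remote(context)   # e.g., a request to the server
    local_store[context] = candidates    # cache for subsequent lookups
    return candidates
```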
- The
emoticon suggestion module 414 may receive the one or more candidate emoticons located based on an identified context of a segment of interest, and present the one or more candidate emoticons to the user for selection. As noted herein, in some embodiments, the emoticon suggestion module 414 may use the display module 404 to display for entry selection the one or more candidate emoticons at or near the current position of the input cursor in the input field. As also noted herein, in various embodiments, the emoticon suggestion module 414 may use the input module 406 to display for entry selection the one or more candidate emoticons through a physical input device or a virtual input device. - The
emoticon selection module 416 may receive from the user an entry selection for one or more candidate emoticons suggested to the user. In particular embodiments, the emoticon selection module 416 may receive the entry selection for the one or more candidate emoticons through the input module 406, and the emoticon selection module 416 may enter the one or more selected emoticons into the input field. The emoticon selection module 416 may enter the one or more selected emoticons at the current position of the input cursor. Additionally, the emoticon selection module 416 may enter the one or more selected emoticons into the input field by replacing segments or segments of interest within the input field with the one or more selected emoticons. Some embodiments may enable the user to set the emoticon selection module 416 (e.g., using a user preference) such that the module 416 auto-selects suggested emoticons based on certain guidelines. For instance, the user may configure the emoticon selection module 416 such that the first suggested emoticon is selected when an emoticon suggestion is made. - In some embodiments, where associations between emoticons and contexts comprise statistical usage of such emoticons with such contexts, the
emoticon selection module 416 may update the statistical usage information based on the entry selection received from the user. In particular, the emoticon selection module 416 may receive the entry selection of one or more candidate emoticons for a given context, and update the statistical usage information stored between the selected candidate emoticons and their respective contexts of usage. - Depending on the embodiment, the
emoticon selection module 416 may update the statistical usage information on the local emoticon datastore 412 or on the remote emoticon datastore 424. For example, if the one or more candidate emoticons selected through the emoticon selection module 416 were provided from the emoticon suggestion server 420, the statistical usage information for those candidate emoticons may be updated on the remote emoticon datastore 424. In another example, if the one or more candidate emoticons selected through the emoticon selection module 416 were provided from the local emoticon datastore 412, the statistical usage information for those candidate emoticons may be updated on the local emoticon datastore 412. -
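The routing rule just described, updating whichever datastore supplied the selected candidate, can be sketched as follows. Tagging each candidate with a `source` field is an assumed bookkeeping detail, not something the disclosure specifies.

```python
def update_usage(candidate, local_usage, remote_usage):
    """Increment the usage count in the datastore the candidate came
    from: 'local' -> local emoticon datastore, otherwise the remote
    (server-side) datastore."""
    target = local_usage if candidate["source"] == "local" else remote_usage
    key = (candidate["context"], candidate["emoticon"])
    target[key] = target.get(key, 0) + 1
    return target[key]
```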
FIG. 5 depicts a user interface 500 of a messaging application, where the messaging application utilizes an embodiment. In some embodiments, a user may utilize the user interface 500 to receive and review messages received from other users over online chat, and to compose and transmit messages to other users over online chat. The messaging application may be a client of an instant messaging system, where the messaging application is operating on a digital device local to the user, such as a smartphone computing device or a laptop. The instant messaging system may operate on another digital device, such as a server, where the messaging application interfaces with the instant messaging system. Depending on the embodiment, the messaging application may operate on a digital device as a standalone application, or as an applet, plug-in, or script operating through a web browser. - The
user interface 500 of the messaging application may comprise a conversation pane 502, a message input field 504, and a send button 514. For some embodiments, the conversation pane 502 may comprise messages submitted to the online chat. As such, the conversation pane 502 may include messages submitted to the online chat from others, and messages submitted by the user through the user interface 500. The user may submit messages to the online chat using the message input field 504. In particular, the user may enter a message into the message input field 504 and press the send button 514 when the user desires to submit the message to the online chat. - The
message input field 504 may comprise and may be configured to receive a message prepared by the user for submission to the online chat. The message input field 504 may receive one or more segments from the user, or may receive one or more emoticons entered in accordance with some embodiments. The message input field 504 may further comprise an input cursor 516. - As the user prepares a message in the
message input field 504, various embodiments may suggest emoticons for entry at the current position of the input cursor 516. For example, as the user writes "It was so much fun" in the message input field 504, the embodiment may suggest a "smiley face" graphical emoticon 510 for entry into the input field 504 based on the embodiment's analysis of the segment of interest 512, which recites "so much fun." The embodiment may suggest the "smiley face" graphical emoticon 510 based on an association between the "smiley face" graphical emoticon 510 and the context of the segment of interest 512. Once the user selects the "smiley face" graphical emoticon 510, the embodiment may enter the "smiley face" graphical emoticon 510 into the message input field 504. - Likewise, as the user writes "Thanks again for showing me around your city" in the
message input field 504, the embodiment may suggest a plurality of graphical emoticons 506 based on the context analysis of the segment of interest 514. As noted herein, the embodiment may present the suggested graphical emoticons 506 by displaying the graphical emoticons 506 in a callout box 508 positioned at or near the current position of the input cursor 516. Based on analysis of the segment of interest 514, which recites "your city," the embodiment may suggest the graphical emoticons 506, which relate to cities. -
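A context-to-emoticon lookup of the kind described (matching words in the segment of interest against stored associations) might look like the sketch below. The `ASSOCIATIONS` table, the specific emoticons, and the simple word-matching strategy are illustrative assumptions, not the patented method.

```python
# Illustrative keyword-to-emoticon association lookup. The association
# data and matching strategy are assumptions for demonstration only.

ASSOCIATIONS = {
    "fun": [":)"],           # hypothetical association entries
    "city": ["🏙", "🌆"],
    "football": ["🏈"],
}


def suggest_emoticons(segment_of_interest):
    """Return candidate emoticons whose keyword appears in the segment."""
    candidates = []
    for word in segment_of_interest.lower().split():
        candidates.extend(ASSOCIATIONS.get(word, []))
    return candidates


print(suggest_emoticons("so much fun"))  # [':)']
print(suggest_emoticons("your city"))    # ['🏙', '🌆']
```

In this toy form, "so much fun" yields the smiley-face candidate and "your city" yields the city-related candidates, mirroring the two FIG. 5 examples above.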
FIG. 6 depicts a user interface 600 of a messaging application, where the messaging application utilizes an embodiment. Like in FIG. 5, a user may utilize the user interface 600 to receive and review messages received from other users over online chat, and to compose and transmit messages to other users over online chat. The messaging application may be a client of an instant messaging system, where the messaging application is operating on a digital device local to the user, such as a smartphone computing device or a laptop. The instant messaging system may operate on another digital device such as a server, where the messaging application interfaces with the instant messaging system. Depending on the embodiment, the messaging application may operate on a digital device as a standalone application, or as an applet, plug-in, or script operating through a web browser. - The
user interface 600 of the messaging application may comprise a conversation pane 602, a message input field 604, an on-screen keyboard 606, and a send button 616. For some embodiments, the conversation pane 602 may comprise messages submitted to the online chat, including messages submitted by the user through the user interface 600. The user may submit messages to the online chat using the message input field 604. Specifically, the user may enter a message into the message input field 604 using the on-screen keyboard 606, and may press the send button 616 when the user desires to submit the message to the online chat. - The
message input field 604 may comprise and may be configured to receive a message prepared by the user for submission to the online chat. The message input field 604 may receive one or more segments from the user through the on-screen keyboard 606, or may receive one or more emoticons as selected through the on-screen keyboard 606. The message input field 604 may further comprise an input cursor 610. - The on-screen keyboard 606 may comprise a QWERTY keyboard, a button 624 to hide the on-screen keyboard 606 from view (e.g., when not in use), and an emoticon menu 622. Through the emoticon menu 622, the user may select one or more emoticons for entry into the message input field 604 at the current position of the input cursor 610. The emoticon menu 622 may comprise emoticons from a default emoticon library, or a selection of emoticons suggested by the embodiment. A left select button 618 and a right select button 620 may allow the user to scroll and browse through the emoticons available for entry selection through the emoticon menu 622. - As the user prepares a message in the
message input field 604, various embodiments may suggest emoticons for entry at the current position of the input cursor 610. For instance, as the user writes "I never would have thought the football" in the message input field 604, the embodiment may suggest a "football" graphical emoticon 614 for entry into the input field 604 based on the embodiment's analysis of the segment of interest 612, which recites "football." The embodiment may suggest the "football" graphical emoticon 614 based on an association between the "football" graphical emoticon 614 and the context of the segment of interest 612. Once the user selects the "football" graphical emoticon 614, the embodiment may enter the "football" graphical emoticon 614 into the message input field 604. - Similarly, as the user writes "The 50-yard field goal" in the
message input field 604, the embodiment may suggest a plurality of "field goal" graphical emoticons based on the context analysis of the segment of interest 608. In particular, the embodiment may present the "field goal" graphical emoticons for entry selection by displaying the graphical emoticons in the emoticon menu 622, which may be displayed as part of the on-screen keyboard 606. -
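Because the emoticon menu 622 can show only a limited number of emoticons at once, some ordering of candidates is needed. One plausible approach, sketched below, ranks candidates by stored usage counts; the function name, the count-based ranking, and the menu size are assumptions for illustration, not details from the disclosure.

```python
# Rank candidate emoticons by usage count and keep the top N that fit
# in the on-screen emoticon menu. A sketch only; all names are invented.

def rank_for_menu(candidates, usage_counts, menu_size=5):
    """Order candidates by descending usage count, truncated to menu size."""
    ranked = sorted(candidates, key=lambda e: usage_counts.get(e, 0),
                    reverse=True)
    return ranked[:menu_size]


counts = {"🏈": 7, "🥅": 3, "🏟": 1}
print(rank_for_menu(["🏟", "🥅", "🏈"], counts, menu_size=2))  # ['🏈', '🥅']
```

Candidates that overflow the menu would then be reachable through the left and right select buttons described above.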
FIG. 7 is a block diagram of an exemplary digital device 700. The digital device 700 comprises a processor 702, a memory system 704, a storage system 706, a communication network interface 708, an I/O interface 710, and a display interface 712 communicatively coupled to a bus 714. The processor 702 is configured to execute executable instructions (e.g., programs). In some embodiments, the processor 702 comprises circuitry or any processor capable of processing the executable instructions. - The
memory system 704 is any memory configured to store data. Some examples of the memory system 704 are storage devices, such as RAM or ROM. The memory system 704 can comprise the RAM cache. In various embodiments, data is stored within the memory system 704. The data within the memory system 704 may be cleared or ultimately transferred to the storage system 706. - The
storage system 706 is any non-transitory storage configured to retrieve and store data. Some examples of the storage system 706 are flash drives, hard drives, optical drives, and/or magnetic tape. In some embodiments, the digital device 700 includes a memory system 704 in the form of RAM and a storage system 706 in the form of flash data. Both the memory system 704 and the storage system 706 comprise computer readable media which may store instructions or programs that are executable by a computer processor including the processor 702. - The communication network interface (com. network interface) 708 can be coupled to a network (e.g., communication network 110) via the
link 716. The communication network interface 708 may support communication over an Ethernet connection, a serial connection, a parallel connection, or an ATA connection, for example. The communication network interface 708 may also support wireless communication (e.g., 802.11 a/b/g/n, WiMax). It will be apparent to those skilled in the art that the communication network interface 708 can support many wired and wireless standards. - The optional input/output (I/O)
interface 710 is any device that receives input from the user and outputs data. The optional display interface 712 is any device that is configured to output graphics and data to a display. In one example, the display interface 712 is a graphics adapter. It will be appreciated that not all digital devices 700 comprise either the I/O interface 710 or the display interface 712. - It will be appreciated by those skilled in the art that the hardware elements of the
digital device 700 are not limited to those depicted in FIG. 7. A digital device 700 may comprise more or fewer hardware elements than those depicted. Further, hardware elements may share functionality and still be within various embodiments described herein. In one example, encoding and/or decoding may be performed by the processor 702 and/or a co-processor located on a GPU (i.e., Nvidia). - The above-described functions and components can be comprised of instructions that are stored on a storage medium such as a computer readable medium. The instructions can be retrieved and executed by a processor. Some examples of instructions are software, program code, and firmware. Some examples of storage media are memory devices, tape, disks, integrated circuits, and servers. The instructions are operational when executed by the processor to direct the processor to operate in accordance with some embodiments. Those skilled in the art are familiar with instructions, processor(s), and storage medium.
- Various embodiments are described herein as examples. It will be apparent to those skilled in the art that various modifications may be made and other embodiments can be used without departing from the broader scope of the present invention(s). Therefore, these and other variations upon the exemplary embodiments are intended to be covered by the present invention(s).
Claims (27)
Priority Applications (33)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/330,357 US20130159919A1 (en) | 2011-12-19 | 2011-12-19 | Systems and Methods for Identifying and Suggesting Emoticons |
NZ713912A NZ713912B2 (en) | 2011-12-19 | 2012-12-19 | Systems and methods for identifying and suggesting emoticons |
EP18000201.6A EP3352092A1 (en) | 2011-12-19 | 2012-12-19 | Systems and methods for identifying and suggesting emoticons |
CN201280068550.6A CN104335607A (en) | 2011-12-19 | 2012-12-19 | Systems and methods for identifying and suggesting emoticons |
KR1020147020236A KR20140105841A (en) | 2011-12-19 | 2012-12-19 | Systems and methods for identifying and suggesting emoticons |
NZ717653A NZ717653A (en) | 2011-12-19 | 2012-12-19 | Systems and methods for identifying and suggesting emoticons |
NZ713913A NZ713913B2 (en) | 2011-12-19 | 2012-12-19 | Systems and methods for identifying and suggesting emoticons |
EP12858827.4A EP2795441B1 (en) | 2011-12-19 | 2012-12-19 | Systems and methods for identifying and suggesting emoticons |
CA2859811A CA2859811A1 (en) | 2011-12-19 | 2012-12-19 | Systems and methods for identifying and suggesting emoticons |
MYPI2014001804A MY170666A (en) | 2011-12-19 | 2012-12-19 | Systems and methods for identifying and suggesting emoticons |
MYPI2018000076A MY189954A (en) | 2011-12-19 | 2012-12-19 | Systems and methods for identifying and suggesting emoticons |
MX2014007479A MX2014007479A (en) | 2011-12-19 | 2012-12-19 | Systems and methods for identifying and suggesting emoticons. |
PCT/US2012/070677 WO2013096482A2 (en) | 2011-12-19 | 2012-12-19 | Systems and methods for identifying and suggesting emoticons |
AU2012358964A AU2012358964B2 (en) | 2011-12-19 | 2012-12-19 | Systems and methods for identifying and suggesting emoticons |
NZ627285A NZ627285A (en) | 2011-12-19 | 2012-12-19 | Systems and methods for identifying and suggesting emoticons |
BR112014015219A BR112014015219A2 (en) | 2011-12-19 | 2012-12-19 | systems and methods for identifying and suggesting emoticons |
JP2014548845A JP6254534B2 (en) | 2011-12-19 | 2012-12-19 | System and method for identifying and proposing emoticons and computer program |
RU2014129554A RU2014129554A (en) | 2011-12-19 | 2012-12-19 | SYSTEMS AND METHODS FOR IDENTIFICATION AND OFFERS OF MOOD SIGNS |
SG11201403373XA SG11201403373XA (en) | 2011-12-19 | 2012-12-19 | Systems and methods for identifying and suggesting emoticons |
SG10201707573WA SG10201707573WA (en) | 2011-12-19 | 2012-12-19 | Systems and methods for identifying and suggesting emoticons |
US14/243,042 US8909513B2 (en) | 2011-12-19 | 2014-04-02 | Systems and methods for identifying and suggesting emoticons |
ZA2014/05207A ZA201405207B (en) | 2011-12-19 | 2014-07-16 | Systems and methods for identifying and suggesting emoticons |
US14/563,004 US9075794B2 (en) | 2011-12-19 | 2014-12-08 | Systems and methods for identifying and suggesting emoticons |
HK15103701.7A HK1203238A1 (en) | 2011-12-19 | 2015-04-15 | Systems and methods for identifying and suggesting emoticons |
US14/733,112 US9244907B2 (en) | 2011-12-19 | 2015-06-08 | Systems and methods for identifying and suggesting emoticons |
US14/976,925 US10254917B2 (en) | 2011-12-19 | 2015-12-21 | Systems and methods for identifying and suggesting emoticons |
AU2016204020A AU2016204020B2 (en) | 2011-12-19 | 2016-06-15 | Systems and methods for identifying and suggesting emoticons |
AU2016206331A AU2016206331B1 (en) | 2011-12-19 | 2016-07-21 | Systems and methods for identifying and suggesting emoticons |
AU2016253602A AU2016253602B2 (en) | 2011-12-19 | 2016-11-02 | Systems and methods for identifying and suggesting emoticons |
JP2017230299A JP6563465B2 (en) | 2011-12-19 | 2017-11-30 | System and method for identifying and proposing emoticons |
AU2019200788A AU2019200788A1 (en) | 2011-12-19 | 2019-02-06 | Systems and methods for identifying and suggesting emoticons |
US16/282,447 US20190187879A1 (en) | 2011-12-19 | 2019-02-22 | Systems and methods for identifying and suggesting emoticons |
JP2019136207A JP2019207726A (en) | 2011-12-19 | 2019-07-24 | Systems and methods for identifying and suggesting emoticons |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/330,357 US20130159919A1 (en) | 2011-12-19 | 2011-12-19 | Systems and Methods for Identifying and Suggesting Emoticons |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/243,042 Continuation US8909513B2 (en) | 2011-12-19 | 2014-04-02 | Systems and methods for identifying and suggesting emoticons |
Publications (1)
Publication Number | Publication Date |
---|---|
US20130159919A1 true US20130159919A1 (en) | 2013-06-20 |
Family
ID=48611572
Family Applications (6)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/330,357 Abandoned US20130159919A1 (en) | 2011-12-19 | 2011-12-19 | Systems and Methods for Identifying and Suggesting Emoticons |
US14/243,042 Expired - Fee Related US8909513B2 (en) | 2011-12-19 | 2014-04-02 | Systems and methods for identifying and suggesting emoticons |
US14/563,004 Expired - Fee Related US9075794B2 (en) | 2011-12-19 | 2014-12-08 | Systems and methods for identifying and suggesting emoticons |
US14/733,112 Expired - Fee Related US9244907B2 (en) | 2011-12-19 | 2015-06-08 | Systems and methods for identifying and suggesting emoticons |
US14/976,925 Expired - Fee Related US10254917B2 (en) | 2011-12-19 | 2015-12-21 | Systems and methods for identifying and suggesting emoticons |
US16/282,447 Abandoned US20190187879A1 (en) | 2011-12-19 | 2019-02-22 | Systems and methods for identifying and suggesting emoticons |
Family Applications After (5)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/243,042 Expired - Fee Related US8909513B2 (en) | 2011-12-19 | 2014-04-02 | Systems and methods for identifying and suggesting emoticons |
US14/563,004 Expired - Fee Related US9075794B2 (en) | 2011-12-19 | 2014-12-08 | Systems and methods for identifying and suggesting emoticons |
US14/733,112 Expired - Fee Related US9244907B2 (en) | 2011-12-19 | 2015-06-08 | Systems and methods for identifying and suggesting emoticons |
US14/976,925 Expired - Fee Related US10254917B2 (en) | 2011-12-19 | 2015-12-21 | Systems and methods for identifying and suggesting emoticons |
US16/282,447 Abandoned US20190187879A1 (en) | 2011-12-19 | 2019-02-22 | Systems and methods for identifying and suggesting emoticons |
Country Status (16)
Country | Link |
---|---|
US (6) | US20130159919A1 (en) |
EP (2) | EP3352092A1 (en) |
JP (3) | JP6254534B2 (en) |
KR (1) | KR20140105841A (en) |
CN (1) | CN104335607A (en) |
AU (5) | AU2012358964B2 (en) |
BR (1) | BR112014015219A2 (en) |
CA (1) | CA2859811A1 (en) |
HK (1) | HK1203238A1 (en) |
MX (1) | MX2014007479A (en) |
MY (2) | MY170666A (en) |
NZ (2) | NZ717653A (en) |
RU (1) | RU2014129554A (en) |
SG (2) | SG11201403373XA (en) |
WO (1) | WO2013096482A2 (en) |
ZA (1) | ZA201405207B (en) |
Cited By (214)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130307779A1 (en) * | 2012-05-17 | 2013-11-21 | Bad Donkey Social, LLC | Systems, methods, and devices for electronic communication |
US20130332308A1 (en) * | 2011-11-21 | 2013-12-12 | Facebook, Inc. | Method for recommending a gift to a sender |
US20140019878A1 (en) * | 2012-07-12 | 2014-01-16 | KamaGames Ltd. | System and method for reflecting player emotional state in an in-game character |
US20140092130A1 (en) * | 2012-09-28 | 2014-04-03 | Glen J. Anderson | Selectively augmenting communications transmitted by a communication device |
US20140214409A1 (en) * | 2011-12-19 | 2014-07-31 | Machine Zone, Inc | Systems and Methods for Identifying and Suggesting Emoticons |
US20140282212A1 (en) * | 2013-03-15 | 2014-09-18 | Gary Shuster | System, method, and apparatus that facilitates modifying a textual input |
US20140324414A1 (en) * | 2013-04-28 | 2014-10-30 | Tencent Technology (Shenzhen) Company Limited | Method and apparatus for displaying emoticon |
CN104333688A (en) * | 2013-12-03 | 2015-02-04 | 广州三星通信技术研究有限公司 | Equipment and method for generating emoticon based on shot image |
WO2015050910A1 (en) * | 2013-10-03 | 2015-04-09 | Microsoft Corporation | Emoji for text predictions |
US20150113439A1 (en) * | 2012-06-25 | 2015-04-23 | Konami Digital Entertainment Co., Ltd. | Message-browsing system, server, terminal device, control method, and recording medium |
US20150121290A1 (en) * | 2012-06-29 | 2015-04-30 | Microsoft Corporation | Semantic Lexicon-Based Input Method Editor |
WO2015061700A1 (en) * | 2013-10-24 | 2015-04-30 | Tapz Communications, LLC | System for effectively communicating concepts |
WO2015066610A1 (en) * | 2013-11-04 | 2015-05-07 | Meemo, Llc | Word recognition and ideograph or in-app advertising system |
US9043196B1 (en) | 2014-07-07 | 2015-05-26 | Machine Zone, Inc. | Systems and methods for identifying and suggesting emoticons |
CN104699662A (en) * | 2015-03-18 | 2015-06-10 | 北京交通大学 | Method and device for recognizing whole symbol string |
US20150172242A1 (en) * | 2013-12-13 | 2015-06-18 | Motorola Mobility Llc | Method and System for Achieving Communications in a Manner Accounting for One or More User Preferences or Contexts |
WO2015087084A1 (en) * | 2013-12-12 | 2015-06-18 | Touchtype Limited | System and method for inputting images or labels into electronic devices |
US20150220774A1 (en) * | 2014-02-05 | 2015-08-06 | Facebook, Inc. | Ideograms for Captured Expressions |
US20150222576A1 (en) * | 2013-10-31 | 2015-08-06 | Hill-Rom Services, Inc. | Context-based message creation via user-selectable icons |
WO2015119605A1 (en) * | 2014-02-05 | 2015-08-13 | Facebook, Inc. | Ideograms based on sentiment analysis |
US20150339017A1 (en) * | 2014-05-21 | 2015-11-26 | Ricoh Company, Ltd. | Terminal apparatus, program, method of calling function, and information processing system |
US20150381534A1 (en) * | 2014-06-25 | 2015-12-31 | Convergence Acceleration Solutions, Llc | Systems and methods for indicating emotions through electronic self-portraits |
WO2016007122A1 (en) * | 2014-07-07 | 2016-01-14 | Machine Zone, Inc. | System and method for identifying and suggesting emoticons |
US20160070453A1 (en) * | 2014-09-05 | 2016-03-10 | Verizon Patent And Licensing Inc. | Method and system for indicating social categories |
WO2016044424A1 (en) * | 2014-09-18 | 2016-03-24 | Snapchat, Inc. | Geolocation-based pictographs |
CN105518577A (en) * | 2013-08-26 | 2016-04-20 | 三星电子株式会社 | User device and method for creating handwriting content |
US20160154825A1 (en) * | 2014-11-27 | 2016-06-02 | Inventec (Pudong) Technology Corporation | Emotion image recommendation system and method thereof |
US20160210276A1 (en) * | 2013-10-24 | 2016-07-21 | Sony Corporation | Information processing device, information processing method, and program |
US20160292148A1 (en) * | 2012-12-27 | 2016-10-06 | Touchtype Limited | System and method for inputting images or labels into electronic devices |
US9515968B2 (en) | 2014-02-05 | 2016-12-06 | Facebook, Inc. | Controlling access to ideograms |
CN106293120A (en) * | 2016-07-29 | 2017-01-04 | 维沃移动通信有限公司 | Expression input method and mobile terminal |
CN106372059A (en) * | 2016-08-30 | 2017-02-01 | 北京百度网讯科技有限公司 | Information input method and information input device |
US9594831B2 (en) | 2012-06-22 | 2017-03-14 | Microsoft Technology Licensing, Llc | Targeted disambiguation of named entities |
US20170083506A1 (en) * | 2015-09-21 | 2017-03-23 | International Business Machines Corporation | Suggesting emoji characters based on current contextual emotional state of user |
US20170177554A1 (en) * | 2015-12-18 | 2017-06-22 | International Business Machines Corporation | Culturally relevant emoji character replacement |
US9723127B1 (en) * | 2016-07-12 | 2017-08-01 | Detrice Grayson | Emoticon scripture system |
WO2017150860A1 (en) * | 2016-02-29 | 2017-09-08 | Samsung Electronics Co., Ltd. | Predicting text input based on user demographic information and context information |
US20170289073A1 (en) * | 2016-03-31 | 2017-10-05 | Atlassian Pty Ltd | Systems and methods for providing controls in a messaging interface |
WO2017184213A1 (en) * | 2016-04-20 | 2017-10-26 | Google Inc. | Iconographic suggestions within a keyboard |
US20170308289A1 (en) * | 2016-04-20 | 2017-10-26 | Google Inc. | Iconographic symbol search within a graphical keyboard |
US20170351678A1 (en) * | 2016-06-03 | 2017-12-07 | Facebook, Inc. | Profile Suggestions |
WO2017223011A1 (en) * | 2016-06-23 | 2017-12-28 | Microsoft Technology Licensing, Llc | Emoji prediction by suppression |
US20170374003A1 (en) | 2014-10-02 | 2017-12-28 | Snapchat, Inc. | Ephemeral gallery of ephemeral messages |
US9866999B1 (en) | 2014-01-12 | 2018-01-09 | Investment Asset Holdings Llc | Location-based messaging |
US9882859B2 (en) | 2012-06-25 | 2018-01-30 | Konami Digital Entertainment Co., Ltd. | Message-browsing system, server, terminal device, control method, and recording medium |
US20180059885A1 (en) * | 2012-11-26 | 2018-03-01 | invi Labs, Inc. | System for and method of accessing and selecting emoticons, content, and mood messages during chat sessions |
US20180248821A1 (en) * | 2016-05-06 | 2018-08-30 | Tencent Technology (Shenzhen) Company Limited | Information pushing method, apparatus, and system, and computer storage medium |
EP3370136A4 (en) * | 2015-10-26 | 2018-10-10 | Baidu Online Network Technology (Beijing) Co., Ltd | Input data processing method, apparatus and device, and non-volatile computer storage medium |
US10123166B2 (en) | 2015-01-26 | 2018-11-06 | Snap Inc. | Content request by location |
US10140373B2 (en) * | 2014-04-15 | 2018-11-27 | Facebook, Inc. | Eliciting user sharing of content |
US10152207B2 (en) * | 2015-08-26 | 2018-12-11 | Xiaomi Inc. | Method and device for changing emoticons in a chat interface |
US10154192B1 (en) | 2014-07-07 | 2018-12-11 | Snap Inc. | Apparatus and method for supplying content aware photo filters |
WO2018226352A1 (en) * | 2017-06-09 | 2018-12-13 | Microsoft Technology Licensing, Llc | Emoji suggester and adapted user interface |
US10157449B1 (en) | 2015-01-09 | 2018-12-18 | Snap Inc. | Geo-location-based image filters |
US10165402B1 (en) | 2016-06-28 | 2018-12-25 | Snap Inc. | System to track engagement of media items |
US10182311B2 (en) | 2014-06-13 | 2019-01-15 | Snap Inc. | Prioritization of messages within a message collection |
US10203855B2 (en) | 2016-12-09 | 2019-02-12 | Snap Inc. | Customized user-controlled media overlays |
US10219111B1 (en) | 2018-04-18 | 2019-02-26 | Snap Inc. | Visitation tracking system |
US10223397B1 (en) | 2015-03-13 | 2019-03-05 | Snap Inc. | Social graph based co-location of network users |
WO2019060351A1 (en) | 2017-09-21 | 2019-03-28 | Mz Ip Holdings, Llc | System and method for utilizing memory-efficient data structures for emoji suggestions |
US10248850B2 (en) * | 2015-02-27 | 2019-04-02 | Immersion Corporation | Generating actions based on a user's mood |
US10319149B1 (en) | 2017-02-17 | 2019-06-11 | Snap Inc. | Augmented reality anamorphosis system |
US10327096B1 (en) | 2018-03-06 | 2019-06-18 | Snap Inc. | Geo-fence selection system |
CN109918530A (en) * | 2019-03-04 | 2019-06-21 | 北京字节跳动网络技术有限公司 | Method and apparatus for pushing image |
US10334307B2 (en) | 2011-07-12 | 2019-06-25 | Snap Inc. | Methods and systems of providing visual content editing functions |
US10341826B2 (en) | 2015-08-14 | 2019-07-02 | Apple Inc. | Easy location sharing |
US10348662B2 (en) | 2016-07-19 | 2019-07-09 | Snap Inc. | Generating customized electronic messaging graphics |
US10353542B2 (en) * | 2015-04-02 | 2019-07-16 | Facebook, Inc. | Techniques for context sensitive illustrated graphical user interface elements |
US10354425B2 (en) | 2015-12-18 | 2019-07-16 | Snap Inc. | Method and system for providing context relevant media augmentation |
US10366543B1 (en) | 2015-10-30 | 2019-07-30 | Snap Inc. | Image based tracking in augmented reality systems |
US10387730B1 (en) | 2017-04-20 | 2019-08-20 | Snap Inc. | Augmented reality typography personalization system |
US10387514B1 (en) | 2016-06-30 | 2019-08-20 | Snap Inc. | Automated content curation and communication |
US10402493B2 (en) | 2009-03-30 | 2019-09-03 | Touchtype Ltd | System and method for inputting text into electronic devices |
US10423983B2 (en) | 2014-09-16 | 2019-09-24 | Snap Inc. | Determining targeting information based on a predictive targeting model |
US10430838B1 (en) | 2016-06-28 | 2019-10-01 | Snap Inc. | Methods and systems for generation, curation, and presentation of media collections with automated advertising |
US10445425B2 (en) * | 2015-09-15 | 2019-10-15 | Apple Inc. | Emoji and canned responses |
US20190340425A1 (en) * | 2018-05-03 | 2019-11-07 | International Business Machines Corporation | Image obtaining based on emotional status |
US10474321B2 (en) | 2015-11-30 | 2019-11-12 | Snap Inc. | Network resource location linking and visual content sharing |
US10484328B2 (en) | 2012-06-25 | 2019-11-19 | Konami Digital Entertainment Co., Ltd. | Message-browsing system, server, terminal device, control method, and recording medium |
US10499191B1 (en) | 2017-10-09 | 2019-12-03 | Snap Inc. | Context sensitive presentation of content |
US10523625B1 (en) | 2017-03-09 | 2019-12-31 | Snap Inc. | Restricted group content collection |
US10528219B2 (en) | 2015-08-10 | 2020-01-07 | Tung Inc. | Conversion and display of a user input |
US10565219B2 (en) | 2014-05-30 | 2020-02-18 | Apple Inc. | Techniques for automatically generating a suggested contact based on a received message |
US10572681B1 (en) | 2014-05-28 | 2020-02-25 | Snap Inc. | Apparatus and method for automated privacy protection in distributed images |
US10580458B2 (en) | 2014-12-19 | 2020-03-03 | Snap Inc. | Gallery of videos set to an audio time line |
US10579212B2 (en) | 2014-05-30 | 2020-03-03 | Apple Inc. | Structured suggestions |
US10592103B2 (en) * | 2016-11-22 | 2020-03-17 | Lg Electronics Inc. | Mobile terminal and method for controlling the same |
US10592574B2 (en) | 2015-05-05 | 2020-03-17 | Snap Inc. | Systems and methods for automated local story generation and curation |
US10616476B1 (en) | 2014-11-12 | 2020-04-07 | Snap Inc. | User interface for accessing media at a geographic location |
US10616239B2 (en) | 2015-03-18 | 2020-04-07 | Snap Inc. | Geo-fence authorization provisioning |
US10623666B2 (en) | 2016-11-07 | 2020-04-14 | Snap Inc. | Selective identification and order of image modifiers |
US10628036B2 (en) | 2016-01-18 | 2020-04-21 | Microsoft Technology Licensing, Llc | Keyboard customization |
US10652287B2 (en) | 2015-01-20 | 2020-05-12 | Tencent Technology (Shenzhen) Company Limited | Method, device, and system for managing information recommendation |
US10659405B1 (en) | 2019-05-06 | 2020-05-19 | Apple Inc. | Avatar integration with multiple applications |
US10679389B2 (en) | 2016-02-26 | 2020-06-09 | Snap Inc. | Methods and systems for generation, curation, and presentation of media collections |
US10679393B2 (en) | 2018-07-24 | 2020-06-09 | Snap Inc. | Conditional modification of augmented reality object |
US10678818B2 (en) | 2018-01-03 | 2020-06-09 | Snap Inc. | Tag distribution visualization system |
US10684771B2 (en) | 2013-08-26 | 2020-06-16 | Samsung Electronics Co., Ltd. | User device and method for creating handwriting content |
US10740974B1 (en) | 2017-09-15 | 2020-08-11 | Snap Inc. | Augmented reality system |
US10769607B2 (en) * | 2014-10-08 | 2020-09-08 | Jgist, Inc. | Universal symbol system language-one world language |
US10817898B2 (en) | 2015-08-13 | 2020-10-27 | Placed, Llc | Determining exposures to content presented by physical objects |
US10834525B2 (en) | 2016-02-26 | 2020-11-10 | Snap Inc. | Generation, curation, and presentation of media collections |
US10862951B1 (en) | 2007-01-05 | 2020-12-08 | Snap Inc. | Real-time display of multiple images |
US10871877B1 (en) * | 2018-11-30 | 2020-12-22 | Facebook, Inc. | Content-based contextual reactions for posts on a social networking system |
US10877629B2 (en) | 2016-10-13 | 2020-12-29 | Tung Inc. | Conversion and display of a user input |
US10885136B1 (en) | 2018-02-28 | 2021-01-05 | Snap Inc. | Audience filtering system |
US10911575B1 (en) | 2015-05-05 | 2021-02-02 | Snap Inc. | Systems and methods for story and sub-story navigation |
US10915911B2 (en) | 2017-02-03 | 2021-02-09 | Snap Inc. | System to determine a price-schedule to distribute media content |
US10933311B2 (en) | 2018-03-14 | 2021-03-02 | Snap Inc. | Generating collectible items based on location information |
US10952013B1 (en) | 2017-04-27 | 2021-03-16 | Snap Inc. | Selective location-based identity communication |
US10948717B1 (en) | 2015-03-23 | 2021-03-16 | Snap Inc. | Reducing boot time and power consumption in wearable display systems |
CN112540756A (en) * | 2020-12-01 | 2021-03-23 | 杭州讯酷科技有限公司 | UI (user interface) construction method based on cursor position recommendation field |
US10963529B1 (en) | 2017-04-27 | 2021-03-30 | Snap Inc. | Location-based search mechanism in a graphical user interface |
US10979752B1 (en) | 2018-02-28 | 2021-04-13 | Snap Inc. | Generating media content items based on location information |
US10993069B2 (en) | 2015-07-16 | 2021-04-27 | Snap Inc. | Dynamically adaptive media content delivery |
US10997760B2 (en) | 2018-08-31 | 2021-05-04 | Snap Inc. | Augmented reality anthropomorphization system |
US10997783B2 (en) | 2015-11-30 | 2021-05-04 | Snap Inc. | Image and point cloud based tracking and in augmented reality systems |
US11017173B1 (en) | 2017-12-22 | 2021-05-25 | Snap Inc. | Named entity recognition visual context and caption data |
US11025565B2 (en) | 2015-06-07 | 2021-06-01 | Apple Inc. | Personalized prediction of responses for instant messaging |
US11023514B2 (en) | 2016-02-26 | 2021-06-01 | Snap Inc. | Methods and systems for generation, curation, and presentation of media collections |
US11030787B2 (en) | 2017-10-30 | 2021-06-08 | Snap Inc. | Mobile-based cartographic control of display content |
US11038829B1 (en) | 2014-10-02 | 2021-06-15 | Snap Inc. | Ephemeral gallery of ephemeral messages with opt-in permanence |
US11037372B2 (en) | 2017-03-06 | 2021-06-15 | Snap Inc. | Virtual vision system |
US11044218B1 (en) * | 2020-10-23 | 2021-06-22 | Slack Technologies, Inc. | Systems and methods for reacting to messages |
US11074408B2 (en) | 2019-06-01 | 2021-07-27 | Apple Inc. | Mail application features |
US11103161B2 (en) | 2018-05-07 | 2021-08-31 | Apple Inc. | Displaying user interfaces associated with physical activities |
US11128715B1 (en) | 2019-12-30 | 2021-09-21 | Snap Inc. | Physical friend proximity in chat |
US11163941B1 (en) | 2018-03-30 | 2021-11-02 | Snap Inc. | Annotating a collection of media content items |
US11170393B1 (en) | 2017-04-11 | 2021-11-09 | Snap Inc. | System to calculate an engagement score of location based media content |
US11182383B1 (en) | 2012-02-24 | 2021-11-23 | Placed, Llc | System and method for data collection to validate location data |
US11189299B1 (en) | 2017-02-20 | 2021-11-30 | Snap Inc. | Augmented reality speech balloon system |
US11194467B2 (en) | 2019-06-01 | 2021-12-07 | Apple Inc. | Keyboard management user interfaces |
US11199957B1 (en) | 2018-11-30 | 2021-12-14 | Snap Inc. | Generating customized avatars based on location information |
US11200503B2 (en) | 2012-12-27 | 2021-12-14 | Microsoft Technology Licensing, Llc | Search system and corresponding method |
US11206615B2 (en) | 2019-05-30 | 2021-12-21 | Snap Inc. | Wearable device location systems |
US11216869B2 (en) | 2014-09-23 | 2022-01-04 | Snap Inc. | User interface to augment an image using geolocation |
US11218838B2 (en) | 2019-10-31 | 2022-01-04 | Snap Inc. | Focused map-based context information surfacing |
US11228551B1 (en) | 2020-02-12 | 2022-01-18 | Snap Inc. | Multiple gateway message exchange |
US11232040B1 (en) | 2017-04-28 | 2022-01-25 | Snap Inc. | Precaching unlockable data elements |
US11249614B2 (en) | 2019-03-28 | 2022-02-15 | Snap Inc. | Generating personalized map interface with enhanced icons |
US11249617B1 (en) | 2015-01-19 | 2022-02-15 | Snap Inc. | Multichannel system |
US11250075B1 (en) | 2017-02-17 | 2022-02-15 | Snap Inc. | Searching social media content |
US11265273B1 (en) | 2017-12-01 | 2022-03-01 | Snap, Inc. | Dynamic media overlay with smart widget |
US20220078295A1 (en) * | 2020-09-08 | 2022-03-10 | Fujifilm Business Innovation Corp. | Information processing apparatus and system and non-transitory computer readable medium |
US11290851B2 (en) | 2020-06-15 | 2022-03-29 | Snap Inc. | Location sharing using offline and online objects |
US11297399B1 (en) | 2017-03-27 | 2022-04-05 | Snap Inc. | Generating a stitched data stream |
US11294936B1 (en) | 2019-01-30 | 2022-04-05 | Snap Inc. | Adaptive spatial density based clustering |
US11301117B2 (en) | 2019-03-08 | 2022-04-12 | Snap Inc. | Contextual information in chat |
US11314776B2 (en) | 2020-06-15 | 2022-04-26 | Snap Inc. | Location sharing using friend list versions |
US11343323B2 (en) | 2019-12-31 | 2022-05-24 | Snap Inc. | Augmented reality objects registry |
US11349796B2 (en) | 2017-03-27 | 2022-05-31 | Snap Inc. | Generating a stitched data stream |
US11361493B2 (en) | 2019-04-01 | 2022-06-14 | Snap Inc. | Semantic texture mapping system |
US20220191158A1 (en) * | 2020-12-15 | 2022-06-16 | Kakao Corp. | Method and server for providing content list and operating method of user terminal |
US11372608B2 (en) | 2014-12-19 | 2022-06-28 | Snap Inc. | Gallery of messages from individuals with a shared interest |
US11388226B1 (en) | 2015-01-13 | 2022-07-12 | Snap Inc. | Guided personal identity based actions |
CN114816599A (en) * | 2021-01-22 | 2022-07-29 | Beijing Zitiao Network Technology Co., Ltd. | Image display method, apparatus, device and medium
US11423596B2 (en) * | 2017-10-23 | 2022-08-23 | Paypal, Inc. | System and method for generating emoji mashups with machine learning |
US20220269354A1 (en) * | 2020-06-19 | 2022-08-25 | Talent Unlimited Online Services Private Limited | Artificial intelligence-based system and method for dynamically predicting and suggesting emojis for messages |
US11430091B2 (en) | 2020-03-27 | 2022-08-30 | Snap Inc. | Location mapping for large scale augmented-reality |
US11429618B2 (en) | 2019-12-30 | 2022-08-30 | Snap Inc. | Surfacing augmented reality objects |
US20220279240A1 (en) * | 2021-03-01 | 2022-09-01 | Comcast Cable Communications, Llc | Systems and methods for providing contextually relevant information |
US11455082B2 (en) | 2018-09-28 | 2022-09-27 | Snap Inc. | Collaborative achievement interface |
US11475254B1 (en) | 2017-09-08 | 2022-10-18 | Snap Inc. | Multimodal entity identification |
US11483267B2 (en) | 2020-06-15 | 2022-10-25 | Snap Inc. | Location sharing using different rate-limited links |
US11494547B2 (en) | 2016-04-13 | 2022-11-08 | Microsoft Technology Licensing, Llc | Inputting images to electronic devices |
US11496425B1 (en) * | 2018-05-10 | 2022-11-08 | Whatsapp Llc | Modifying message content based on user preferences |
US11500525B2 (en) | 2019-02-25 | 2022-11-15 | Snap Inc. | Custom media overlay system |
US11503432B2 (en) | 2020-06-15 | 2022-11-15 | Snap Inc. | Scalable real-time location sharing framework |
US11507614B1 (en) | 2018-02-13 | 2022-11-22 | Snap Inc. | Icon based tagging |
US11516167B2 (en) | 2020-03-05 | 2022-11-29 | Snap Inc. | Storing data based on device location |
US20230004277A1 (en) * | 2021-06-30 | 2023-01-05 | Slack Technologies, Inc. | User interface for searching content of a communication platform using reaction icons |
US11558709B2 (en) | 2018-11-30 | 2023-01-17 | Snap Inc. | Position service to determine relative position to map features |
US20230035961A1 (en) * | 2021-07-29 | 2023-02-02 | Fannie Liu | Emoji recommendation system using user context and biosignals |
US11573679B2 (en) * | 2018-04-30 | 2023-02-07 | The Trustees of the California State University | Integration of user emotions for a smartphone or other communication device environment |
US11574431B2 (en) | 2019-02-26 | 2023-02-07 | Snap Inc. | Avatar based on weather |
US11601783B2 (en) | 2019-06-07 | 2023-03-07 | Snap Inc. | Detection of a physical collision between two client devices in a location sharing system |
US11601888B2 (en) | 2021-03-29 | 2023-03-07 | Snap Inc. | Determining location using multi-source geolocation data |
US11606755B2 (en) | 2019-05-30 | 2023-03-14 | Snap Inc. | Wearable device location systems architecture |
US11616745B2 (en) | 2017-01-09 | 2023-03-28 | Snap Inc. | Contextual generation and selection of customized media content |
US11619501B2 (en) | 2020-03-11 | 2023-04-04 | Snap Inc. | Avatar based on trip |
US11620001B2 (en) * | 2017-06-29 | 2023-04-04 | Snap Inc. | Pictorial symbol prediction |
US11625443B2 (en) | 2014-06-05 | 2023-04-11 | Snap Inc. | Web document enhancement |
US11631276B2 (en) | 2016-03-31 | 2023-04-18 | Snap Inc. | Automated avatar generation |
US20230123271A1 (en) * | 2021-10-20 | 2023-04-20 | International Business Machines Corporation | Decoding communications with token sky maps |
US11645324B2 (en) | 2021-03-31 | 2023-05-09 | Snap Inc. | Location-based timeline media content system |
US11676378B2 (en) | 2020-06-29 | 2023-06-13 | Snap Inc. | Providing travel-based augmented reality content with a captured image |
US11675831B2 (en) | 2017-05-31 | 2023-06-13 | Snap Inc. | Geolocation based playlists |
US11714535B2 (en) | 2019-07-11 | 2023-08-01 | Snap Inc. | Edge gesture interface with smart interactions |
US11734712B2 (en) | 2012-02-24 | 2023-08-22 | Foursquare Labs, Inc. | Attributing in-store visits to media consumption based on data collected from user devices |
US11751015B2 (en) | 2019-01-16 | 2023-09-05 | Snap Inc. | Location-based context information sharing in a messaging system |
US11776256B2 (en) | 2020-03-27 | 2023-10-03 | Snap Inc. | Shared augmented reality system |
US11782575B2 (en) | 2018-05-07 | 2023-10-10 | Apple Inc. | User interfaces for sharing contextually relevant media content |
US11799811B2 (en) | 2018-10-31 | 2023-10-24 | Snap Inc. | Messaging and gaming applications communication platform |
US11809624B2 (en) | 2019-02-13 | 2023-11-07 | Snap Inc. | Sleep detection in a location sharing system |
US11816853B2 (en) | 2016-08-30 | 2023-11-14 | Snap Inc. | Systems and methods for simultaneous localization and mapping |
US11821742B2 (en) | 2019-09-26 | 2023-11-21 | Snap Inc. | Travel based notifications |
US11829834B2 (en) | 2021-10-29 | 2023-11-28 | Snap Inc. | Extended QR code |
US11843456B2 (en) | 2016-10-24 | 2023-12-12 | Snap Inc. | Generating and displaying customized avatars in media overlays |
US11842411B2 (en) | 2017-04-27 | 2023-12-12 | Snap Inc. | Location-based virtual avatars |
US11852554B1 (en) | 2019-03-21 | 2023-12-26 | Snap Inc. | Barometer calibration in a location sharing system |
US11860888B2 (en) | 2018-05-22 | 2024-01-02 | Snap Inc. | Event detection system |
US11868414B1 (en) | 2019-03-14 | 2024-01-09 | Snap Inc. | Graph-based prediction for contact suggestion in a location sharing system |
US11870743B1 (en) | 2017-01-23 | 2024-01-09 | Snap Inc. | Customized digital avatar accessories |
US11877211B2 (en) | 2019-01-14 | 2024-01-16 | Snap Inc. | Destination sharing in location sharing system |
US11893208B2 (en) | 2019-12-31 | 2024-02-06 | Snap Inc. | Combined map icon with action indicator |
US11925869B2 (en) | 2012-05-08 | 2024-03-12 | Snap Inc. | System and method for generating and displaying avatars |
US11943192B2 (en) | 2020-08-31 | 2024-03-26 | Snap Inc. | Co-location connection service |
US11972529B2 (en) | 2019-02-01 | 2024-04-30 | Snap Inc. | Augmented reality system |
US12001750B2 (en) | 2022-04-20 | 2024-06-04 | Snap Inc. | Location-based shared augmented reality experience system |
US12020386B2 (en) | 2022-06-23 | 2024-06-25 | Snap Inc. | Applying pregenerated virtual experiences in new location |
US12020384B2 (en) | 2022-06-21 | 2024-06-25 | Snap Inc. | Integrating augmented reality experiences with other components |
US12026362B2 (en) | 2021-05-19 | 2024-07-02 | Snap Inc. | Video editing application for mobile devices |
US20240220717A1 (en) * | 2022-12-28 | 2024-07-04 | Twilio Inc. | Adding theme-based content to messages using artificial intelligence |
US12127068B2 (en) | 2020-07-30 | 2024-10-22 | Investment Asset Holdings Llc | Map interface with icon for location-based messages |
Families Citing this family (33)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6157299B2 (en) * | 2013-09-27 | 2017-07-05 | KDDI Corporation | Communication terminal, management server, message exchange system, message exchange method, and message exchange program
KR102337072B1 (en) * | 2014-09-12 | 2021-12-08 | Samsung Electronics Co., Ltd. | Method for making emoticon and electronic device implementing the same
KR20160089152A (en) | 2015-01-19 | 2016-07-27 | NCSOFT Corporation | Method and computer system for analyzing communication situations based on dialogue act information
KR101641572B1 (en) * | 2015-01-19 | 2016-07-21 | NCSOFT Corporation | Method and computer program for ranking dialogue stickers based on situation and preference information
KR101615848B1 (en) * | 2015-01-19 | 2016-04-26 | NCSOFT Corporation | Method and computer program for recommending dialogue stickers based on similar-situation detection
KR101634086B1 (en) * | 2015-01-19 | 2016-07-08 | NCSOFT Corporation | Method and computer system for analyzing communication situations based on emotion information
US10623352B2 (en) | 2015-03-30 | 2020-04-14 | International Business Machines Corporation | Modification of electronic messages |
CN105094363A (en) * | 2015-07-06 | 2015-11-25 | Baidu Online Network Technology (Beijing) Co., Ltd. | Method and apparatus for processing an emotion signal
US10203843B2 (en) | 2015-09-21 | 2019-02-12 | Microsoft Technology Licensing, Llc | Facilitating selection of attribute values for graphical elements |
US10540431B2 (en) | 2015-11-23 | 2020-01-21 | Microsoft Technology Licensing, Llc | Emoji reactions for file content and associated activities |
USD852839S1 (en) * | 2015-12-23 | 2019-07-02 | Beijing Xinmei Hutong Technology Co., Ltd | Display screen with a graphical user interface |
CN108701125A (en) * | 2015-12-29 | 2018-10-23 | MZ IP Holdings, LLC | System and method for suggesting emoticons
WO2017120924A1 (en) * | 2016-01-15 | 2017-07-20 | Li Qiangsheng | Information prompting method for use when inserting an emoticon, and instant communication tool
CN105700703A (en) * | 2016-02-24 | 2016-06-22 | Beijing Xiaoniu Internet Technology Co., Ltd. | Method and device for inserting expressions, including user-defined expressions, in the character input interface of a keyboard
US10686899B2 (en) | 2016-04-06 | 2020-06-16 | Snap Inc. | Messaging achievement pictograph display system |
KR101780809B1 (en) * | 2016-05-09 | 2017-09-22 | Naver Corporation | Method, user terminal, server and computer program for providing translation with emoticon
US11115463B2 (en) | 2016-08-17 | 2021-09-07 | Microsoft Technology Licensing, Llc | Remote and local predictions |
US20180081500A1 (en) * | 2016-09-19 | 2018-03-22 | Facebook, Inc. | Systems and methods for content engagement |
CN106502515B (en) * | 2016-09-30 | 2020-01-14 | Vivo Mobile Communication Co., Ltd. | Picture input method and mobile terminal
CN106503744A (en) * | 2016-10-26 | 2017-03-15 | Changsha Junge Software Co., Ltd. | Method and device for automatically correcting emoticons entered during a chat session
US10452411B2 (en) | 2016-12-30 | 2019-10-22 | Riyad Mobeen | System and method of using emojis for interactive and enhanced data matching capabilities |
CN107093164A (en) * | 2017-04-26 | 2017-08-25 | Beijing Baidu Netcom Science and Technology Co., Ltd. | Method and apparatus for generating an image
CN108173747B (en) * | 2017-12-27 | 2021-10-22 | Shanghai Transsion Information Technology Co., Ltd. | Information interaction method and device
US10970329B1 (en) * | 2018-03-30 | 2021-04-06 | Snap Inc. | Associating a graphical element to media content item collections |
KR20240017141A (en) | 2018-08-31 | 2024-02-06 | Google LLC | Methods and systems for positioning animated images within a dynamic keyboard interface
US10652182B1 (en) * | 2018-11-01 | 2020-05-12 | International Business Machines Corporation | Unlocking emoticons based on professional skills |
JP6636604B2 (en) * | 2018-12-12 | 2020-01-29 | COLOPL, Inc. | Emotion text display program, method and system
CN110162670B (en) * | 2019-05-27 | 2020-05-08 | Beijing ByteDance Network Technology Co., Ltd. | Method and device for generating an emoticon pack
US20220291789A1 (en) * | 2019-07-11 | 2022-09-15 | Google Llc | System and Method for Providing an Artificial Intelligence Control Surface for a User of a Computing Device |
CN111131006B (en) * | 2019-12-31 | 2021-05-18 | Lenovo (Beijing) Co., Ltd. | Information processing method
US11209964B1 (en) * | 2020-06-05 | 2021-12-28 | Slack Technologies, LLC | System and method for reacting to messages
EP4300409A4 (en) | 2021-08-26 | 2024-08-28 | Samsung Electronics Co Ltd | Method and device for generating emotional combination content |
DE102022110951A1 (en) | 2022-05-04 | 2023-11-09 | fm menschenbezogen GmbH | Device for selecting a training and/or usage recommendation and/or a characterization |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6269189B1 (en) * | 1998-12-29 | 2001-07-31 | Xerox Corporation | Finding selected character strings in text and providing information relating to the selected character strings |
US20060015812A1 (en) * | 2004-07-15 | 2006-01-19 | Cingular Wireless Ii, Llc | Using emoticons, such as for wireless devices |
US6990452B1 (en) * | 2000-11-03 | 2006-01-24 | At&T Corp. | Method for sending multi-media messages using emoticons |
US20100179991A1 (en) * | 2006-01-16 | 2010-07-15 | Zlango Ltd. | Iconic Communication |
US8065601B2 (en) * | 2006-08-03 | 2011-11-22 | Apple Inc. | System and method for tagging data |
US8299943B2 (en) * | 2007-05-22 | 2012-10-30 | Tegic Communications, Inc. | Multiple predictions in a reduced keyboard disambiguating system |
US8547354B2 (en) * | 2010-11-05 | 2013-10-01 | Apple Inc. | Device, method, and graphical user interface for manipulating soft keyboards |
Family Cites Families (56)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4706212A (en) | 1971-08-31 | 1987-11-10 | Toma Peter P | Method using a programmed digital computer system for translation between natural languages |
JP3393162B2 (en) * | 1994-03-31 | 2003-04-07 | Sharp Corporation | Text editing device
US5805911A (en) | 1995-02-01 | 1998-09-08 | Microsoft Corporation | Word prediction system |
US7319957B2 (en) * | 2004-02-11 | 2008-01-15 | Tegic Communications, Inc. | Handwriting and voice input with automatic correction |
US7051019B1 (en) * | 1999-08-17 | 2006-05-23 | Corbis Corporation | Method and system for obtaining images from a database having images that are relevant to indicated text |
JP2003256586A (en) | 2002-02-28 | 2003-09-12 | Fuji Photo Film Co Ltd | Animation character generation system |
JP4232093B2 (en) * | 2003-06-30 | 2009-03-04 | NEC Corporation | Emoticon input system and method using multiple search methods
JP2005063245A (en) * | 2003-08-18 | 2005-03-10 | Nippon Telegraph & Telephone Corp (NTT) | Input support system, terminal device with input support function, and its program
US20050192802A1 (en) * | 2004-02-11 | 2005-09-01 | Alex Robinson | Handwriting and voice input with automatic correction |
WO2006075334A2 (en) | 2005-01-16 | 2006-07-20 | Zlango Ltd. | Iconic communication |
WO2006075335A2 (en) | 2005-01-16 | 2006-07-20 | Zlango Ltd. | Communications network system and methods for using same |
US7506254B2 (en) | 2005-04-21 | 2009-03-17 | Google Inc. | Predictive conversion of user input |
US8843482B2 (en) | 2005-10-28 | 2014-09-23 | Telecom Italia S.P.A. | Method of providing selected content items to a user |
IL173011A (en) | 2006-01-08 | 2012-01-31 | Picscout Ltd | Image insertion for cellular text messaging |
EP1977617A2 (en) | 2006-01-16 | 2008-10-08 | Zlango Ltd. | Activating an application |
EP1982450A2 (en) | 2006-01-16 | 2008-10-22 | Zlango Ltd. | Communications network system and methods for using same |
JP4730114B2 (en) | 2006-01-30 | 2011-07-20 | NEC Corporation | Message creation support method and portable terminal
US8166418B2 (en) * | 2006-05-26 | 2012-04-24 | Zi Corporation Of Canada, Inc. | Device and method of conveying meaning |
TW200809543A (en) | 2006-08-10 | 2008-02-16 | Inventec Corp | Mail-editing system and method |
JP5098304B2 (en) * | 2006-11-17 | 2012-12-12 | NEC Corporation | Special character input support device and electronic device including the same
JP2008225602A (en) * | 2007-03-09 | 2008-09-25 | Ntt Docomo Inc | Pictogram search apparatus, system and method |
US20080244446A1 (en) * | 2007-03-29 | 2008-10-02 | Lefevre John | Disambiguation of icons and other media in text-based applications |
JP2009110056A (en) | 2007-10-26 | 2009-05-21 | Panasonic Corp | Communication device |
CN100570545C (en) * | 2007-12-17 | 2009-12-16 | Tencent Technology (Shenzhen) Co., Ltd. | Expression input method and device
JP5290570B2 (en) | 2007-12-26 | 2013-09-18 | Kyocera Corporation | Pictogram input support device, pictogram input support method, and program
WO2009128838A1 (en) | 2008-04-18 | 2009-10-22 | Tegic Communications, Inc. | Disambiguation of icons and other media in text-based applications |
US9646078B2 (en) | 2008-05-12 | 2017-05-09 | Groupon, Inc. | Sentiment extraction from consumer reviews for providing product recommendations |
US8458153B2 (en) * | 2008-08-26 | 2013-06-04 | Michael Pierce | Web-based services for querying and matching likes and dislikes of individuals |
US20100131447A1 (en) * | 2008-11-26 | 2010-05-27 | Nokia Corporation | Method, Apparatus and Computer Program Product for Providing an Adaptive Word Completion Mechanism |
US20100332287A1 (en) | 2009-06-24 | 2010-12-30 | International Business Machines Corporation | System and method for real-time prediction of customer satisfaction |
JP5267450B2 (en) * | 2009-12-28 | 2013-08-21 | Denso Corporation | Electronic device and program
JP2011171891A (en) * | 2010-02-17 | 2011-09-01 | Panasonic Corp | Portable terminal |
JP5398638B2 (en) | 2010-05-25 | 2014-01-29 | Nippon Telegraph and Telephone Corporation | Symbol input support device, symbol input support method, and program
CN101820475A (en) * | 2010-05-25 | 2010-09-01 | Talkweb Information System Co., Ltd. | Method for generating cell phone multimedia messages based on intelligent semantic understanding
CN103229168B (en) | 2010-09-28 | 2016-10-19 | International Business Machines Corporation | Method and system for evidence diffusion between multiple candidate answers during question answering
US9536269B2 (en) | 2011-01-19 | 2017-01-03 | 24/7 Customer, Inc. | Method and apparatus for analyzing and applying data related to customer interactions with social media |
US8548951B2 (en) | 2011-03-10 | 2013-10-01 | Textwise Llc | Method and system for unified information representation and applications thereof |
US8903713B2 (en) | 2011-11-19 | 2014-12-02 | Richard L. Peterson | Method and apparatus for automatically analyzing natural language to extract useful information |
US9348479B2 (en) | 2011-12-08 | 2016-05-24 | Microsoft Technology Licensing, Llc | Sentiment aware user interface customization |
US20130159919A1 (en) | 2011-12-19 | 2013-06-20 | Gabriel Leydon | Systems and Methods for Identifying and Suggesting Emoticons |
US20130247078A1 (en) * | 2012-03-19 | 2013-09-19 | Rawllin International Inc. | Emoticons for media |
US9483730B2 (en) | 2012-12-07 | 2016-11-01 | At&T Intellectual Property I, L.P. | Hybrid review synthesis |
GB201322037D0 (en) | 2013-12-12 | 2014-01-29 | Touchtype Ltd | System and method for inputting images/labels into electronic devices |
US20140278786A1 (en) * | 2013-03-14 | 2014-09-18 | Twain Liu-Qiu-Yan | System and method to survey and evaluate items according to people's perceptions and to generate recommendations based on people's perceptions |
US9613023B2 (en) | 2013-04-04 | 2017-04-04 | Wayne M. Kennard | System and method for generating ethnic and cultural emoticon language dictionaries |
US20140365208A1 (en) | 2013-06-05 | 2014-12-11 | Microsoft Corporation | Classification of affective states in social media |
US9311467B2 (en) | 2013-08-20 | 2016-04-12 | International Business Machines Corporation | Composite propensity profile detector |
US10706367B2 (en) | 2013-09-10 | 2020-07-07 | Facebook, Inc. | Sentiment polarity for users of a social networking system |
US20150100537A1 (en) | 2013-10-03 | 2015-04-09 | Microsoft Corporation | Emoji for Text Predictions |
US20150199609A1 (en) | 2013-12-20 | 2015-07-16 | Xurmo Technologies Pvt. Ltd | Self-learning system for determining the sentiment conveyed by an input text |
US10013601B2 (en) | 2014-02-05 | 2018-07-03 | Facebook, Inc. | Ideograms for captured expressions |
US10038786B2 (en) | 2014-03-05 | 2018-07-31 | [24]7.ai, Inc. | Method and apparatus for improving goal-directed textual conversations between agents and customers |
RU2571373C2 (en) | 2014-03-31 | 2015-12-20 | Общество с ограниченной ответственностью "Аби ИнфоПоиск" | Method of analysing text data tonality |
US9043196B1 (en) | 2014-07-07 | 2015-05-26 | Machine Zone, Inc. | Systems and methods for identifying and suggesting emoticons |
US20160048768A1 (en) | 2014-08-15 | 2016-02-18 | Here Global B.V. | Topic Model For Comments Analysis And Use Thereof |
US10552759B2 (en) | 2014-12-01 | 2020-02-04 | Facebook, Inc. | Iterative classifier training on online social networks |
- 2011
- 2011-12-19 US US13/330,357 patent/US20130159919A1/en not_active Abandoned
- 2012
- 2012-12-19 CA CA2859811A patent/CA2859811A1/en not_active Abandoned
- 2012-12-19 EP EP18000201.6A patent/EP3352092A1/en not_active Withdrawn
- 2012-12-19 KR KR1020147020236A patent/KR20140105841A/en active Search and Examination
- 2012-12-19 NZ NZ717653A patent/NZ717653A/en not_active IP Right Cessation
- 2012-12-19 MY MYPI2014001804A patent/MY170666A/en unknown
- 2012-12-19 CN CN201280068550.6A patent/CN104335607A/en active Pending
- 2012-12-19 MY MYPI2018000076A patent/MY189954A/en unknown
- 2012-12-19 JP JP2014548845A patent/JP6254534B2/en not_active Expired - Fee Related
- 2012-12-19 BR BR112014015219A patent/BR112014015219A2/en not_active Application Discontinuation
- 2012-12-19 EP EP12858827.4A patent/EP2795441B1/en not_active Not-in-force
- 2012-12-19 MX MX2014007479A patent/MX2014007479A/en unknown
- 2012-12-19 AU AU2012358964A patent/AU2012358964B2/en not_active Ceased
- 2012-12-19 SG SG11201403373XA patent/SG11201403373XA/en unknown
- 2012-12-19 RU RU2014129554A patent/RU2014129554A/en unknown
- 2012-12-19 SG SG10201707573WA patent/SG10201707573WA/en unknown
- 2012-12-19 WO PCT/US2012/070677 patent/WO2013096482A2/en active Application Filing
- 2012-12-19 NZ NZ627285A patent/NZ627285A/en not_active IP Right Cessation
- 2014
- 2014-04-02 US US14/243,042 patent/US8909513B2/en not_active Expired - Fee Related
- 2014-07-16 ZA ZA2014/05207A patent/ZA201405207B/en unknown
- 2014-12-08 US US14/563,004 patent/US9075794B2/en not_active Expired - Fee Related
- 2015
- 2015-04-15 HK HK15103701.7A patent/HK1203238A1/en not_active IP Right Cessation
- 2015-06-08 US US14/733,112 patent/US9244907B2/en not_active Expired - Fee Related
- 2015-12-21 US US14/976,925 patent/US10254917B2/en not_active Expired - Fee Related
- 2016
- 2016-06-15 AU AU2016204020A patent/AU2016204020B2/en not_active Ceased
- 2016-07-21 AU AU2016206331A patent/AU2016206331B1/en not_active Ceased
- 2016-11-02 AU AU2016253602A patent/AU2016253602B2/en not_active Ceased
- 2017
- 2017-11-30 JP JP2017230299A patent/JP6563465B2/en not_active Expired - Fee Related
- 2019
- 2019-02-06 AU AU2019200788A patent/AU2019200788A1/en not_active Abandoned
- 2019-02-22 US US16/282,447 patent/US20190187879A1/en not_active Abandoned
- 2019-07-24 JP JP2019136207A patent/JP2019207726A/en active Pending
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6269189B1 (en) * | 1998-12-29 | 2001-07-31 | Xerox Corporation | Finding selected character strings in text and providing information relating to the selected character strings |
US6990452B1 (en) * | 2000-11-03 | 2006-01-24 | At&T Corp. | Method for sending multi-media messages using emoticons |
US7921013B1 (en) * | 2000-11-03 | 2011-04-05 | At&T Intellectual Property Ii, L.P. | System and method for sending multi-media messages using emoticons |
US20060015812A1 (en) * | 2004-07-15 | 2006-01-19 | Cingular Wireless Ii, Llc | Using emoticons, such as for wireless devices |
US20100179991A1 (en) * | 2006-01-16 | 2010-07-15 | Zlango Ltd. | Iconic Communication |
US8065601B2 (en) * | 2006-08-03 | 2011-11-22 | Apple Inc. | System and method for tagging data |
US8549391B2 (en) * | 2006-08-03 | 2013-10-01 | Apple Inc. | System and method for tagging data |
US8299943B2 (en) * | 2007-05-22 | 2012-10-30 | Tegic Communications, Inc. | Multiple predictions in a reduced keyboard disambiguating system |
US8547354B2 (en) * | 2010-11-05 | 2013-10-01 | Apple Inc. | Device, method, and graphical user interface for manipulating soft keyboards |
Cited By (459)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10862951B1 (en) | 2007-01-05 | 2020-12-08 | Snap Inc. | Real-time display of multiple images |
US11588770B2 (en) | 2007-01-05 | 2023-02-21 | Snap Inc. | Real-time display of multiple images |
US10402493B2 (en) | 2009-03-30 | 2019-09-03 | Touchtype Ltd | System and method for inputting text into electronic devices |
US10999623B2 (en) | 2011-07-12 | 2021-05-04 | Snap Inc. | Providing visual content editing functions |
US11750875B2 (en) | 2011-07-12 | 2023-09-05 | Snap Inc. | Providing visual content editing functions |
US11451856B2 (en) | 2011-07-12 | 2022-09-20 | Snap Inc. | Providing visual content editing functions |
US10334307B2 (en) | 2011-07-12 | 2019-06-25 | Snap Inc. | Methods and systems of providing visual content editing functions |
US20130332308A1 (en) * | 2011-11-21 | 2013-12-12 | Facebook, Inc. | Method for recommending a gift to a sender |
US20140214409A1 (en) * | 2011-12-19 | 2014-07-31 | Machine Zone, Inc. | Systems and Methods for Identifying and Suggesting Emoticons
US8909513B2 (en) * | 2011-12-19 | 2014-12-09 | Machine Zone, Inc. | Systems and methods for identifying and suggesting emoticons |
US9244907B2 (en) | 2011-12-19 | 2016-01-26 | Machine Zone, Inc. | Systems and methods for identifying and suggesting emoticons |
US10254917B2 (en) * | 2011-12-19 | 2019-04-09 | Mz Ip Holdings, Llc | Systems and methods for identifying and suggesting emoticons |
US9075794B2 (en) | 2011-12-19 | 2015-07-07 | Machine Zone, Inc. | Systems and methods for identifying and suggesting emoticons |
US20160110058A1 (en) * | 2011-12-19 | 2016-04-21 | Machine Zone, Inc. | Systems and methods for identifying and suggesting emoticons |
US11734712B2 (en) | 2012-02-24 | 2023-08-22 | Foursquare Labs, Inc. | Attributing in-store visits to media consumption based on data collected from user devices |
US11182383B1 (en) | 2012-02-24 | 2021-11-23 | Placed, Llc | System and method for data collection to validate location data |
US11925869B2 (en) | 2012-05-08 | 2024-03-12 | Snap Inc. | System and method for generating and displaying avatars |
US20130307779A1 (en) * | 2012-05-17 | 2013-11-21 | Bad Donkey Social, LLC | Systems, methods, and devices for electronic communication |
US9594831B2 (en) | 2012-06-22 | 2017-03-14 | Microsoft Technology Licensing, Llc | Targeted disambiguation of named entities |
US20150113439A1 (en) * | 2012-06-25 | 2015-04-23 | Konami Digital Entertainment Co., Ltd. | Message-browsing system, server, terminal device, control method, and recording medium |
US9954812B2 (en) * | 2012-06-25 | 2018-04-24 | Konami Digital Entertainment Co., Ltd. | Message-browsing system, server, terminal device, control method, and recording medium |
US9882859B2 (en) | 2012-06-25 | 2018-01-30 | Konami Digital Entertainment Co., Ltd. | Message-browsing system, server, terminal device, control method, and recording medium |
US10484328B2 (en) | 2012-06-25 | 2019-11-19 | Konami Digital Entertainment Co., Ltd. | Message-browsing system, server, terminal device, control method, and recording medium |
US9959340B2 (en) * | 2012-06-29 | 2018-05-01 | Microsoft Technology Licensing, Llc | Semantic lexicon-based input method editor |
US20150121290A1 (en) * | 2012-06-29 | 2015-04-30 | Microsoft Corporation | Semantic Lexicon-Based Input Method Editor |
US20140019878A1 (en) * | 2012-07-12 | 2014-01-16 | KamaGames Ltd. | System and method for reflecting player emotional state in an in-game character |
US20140092130A1 (en) * | 2012-09-28 | 2014-04-03 | Glen J. Anderson | Selectively augmenting communications transmitted by a communication device |
US20180203584A1 (en) * | 2012-09-28 | 2018-07-19 | Intel Corporation | Selectively augmenting communications transmitted by a communication device |
US20220269392A1 (en) * | 2012-09-28 | 2022-08-25 | Intel Corporation | Selectively augmenting communications transmitted by a communication device |
US12105928B2 (en) * | 2012-09-28 | 2024-10-01 | Tahoe Research, Ltd. | Selectively augmenting communications transmitted by a communication device |
US9746990B2 (en) * | 2012-09-28 | 2017-08-29 | Intel Corporation | Selectively augmenting communications transmitted by a communication device |
US20180059885A1 (en) * | 2012-11-26 | 2018-03-01 | invi Labs, Inc. | System for and method of accessing and selecting emoticons, content, and mood messages during chat sessions |
US10824297B2 (en) * | 2012-11-26 | 2020-11-03 | Google Llc | System for and method of accessing and selecting emoticons, content, and mood messages during chat sessions |
US20160292148A1 (en) * | 2012-12-27 | 2016-10-06 | Touchtype Limited | System and method for inputting images or labels into electronic devices |
US11200503B2 (en) | 2012-12-27 | 2021-12-14 | Microsoft Technology Licensing, Llc | Search system and corresponding method |
US10664657B2 (en) * | 2012-12-27 | 2020-05-26 | Touchtype Limited | System and method for inputting images or labels into electronic devices |
US20140282212A1 (en) * | 2013-03-15 | 2014-09-18 | Gary Shuster | System, method, and apparatus that facilitates modifying a textual input |
US20140324414A1 (en) * | 2013-04-28 | 2014-10-30 | Tencent Technology (Shenzhen) Company Limited | Method and apparatus for displaying emoticon |
US11474688B2 (en) | 2013-08-26 | 2022-10-18 | Samsung Electronics Co., Ltd. | User device and method for creating handwriting content |
CN105518577A (en) * | 2013-08-26 | 2016-04-20 | 三星电子株式会社 | User device and method for creating handwriting content |
EP3039512A4 (en) * | 2013-08-26 | 2017-08-16 | Samsung Electronics Co., Ltd. | User device and method for creating handwriting content |
US10684771B2 (en) | 2013-08-26 | 2020-06-16 | Samsung Electronics Co., Ltd. | User device and method for creating handwriting content |
KR102262453B1 (en) * | 2013-10-03 | 2021-06-07 | Microsoft Technology Licensing, LLC | Emoji for text predictions
KR20160065174A (en) * | 2013-10-03 | 2016-06-08 | Microsoft Technology Licensing, LLC | Emoji for text predictions
WO2015050910A1 (en) * | 2013-10-03 | 2015-04-09 | Microsoft Corporation | Emoji for text predictions |
US20160210276A1 (en) * | 2013-10-24 | 2016-07-21 | Sony Corporation | Information processing device, information processing method, and program |
WO2015061700A1 (en) * | 2013-10-24 | 2015-04-30 | Tapz Communications, LLC | System for effectively communicating concepts |
US20150222576A1 (en) * | 2013-10-31 | 2015-08-06 | Hill-Rom Services, Inc. | Context-based message creation via user-selectable icons |
US9961026B2 (en) * | 2013-10-31 | 2018-05-01 | Intel Corporation | Context-based message creation via user-selectable icons |
US20150127753A1 (en) * | 2013-11-04 | 2015-05-07 | Meemo, Llc | Word Recognition and Ideograph or In-App Advertising System |
WO2015066610A1 (en) * | 2013-11-04 | 2015-05-07 | Meemo, Llc | Word recognition and ideograph or in-app advertising system |
US9152979B2 (en) | 2013-11-04 | 2015-10-06 | Meemo, Llc | Word recognition and ideograph or in-app advertising system |
US9317870B2 (en) | 2013-11-04 | 2016-04-19 | Meemo, Llc | Word recognition and ideograph or in-app advertising system |
CN104333688A (en) * | 2013-12-03 | 2015-02-04 | Guangzhou Samsung Telecommunication Technology Research Co., Ltd. | Equipment and method for generating emoticon based on shot image
WO2015087084A1 (en) * | 2013-12-12 | 2015-06-18 | Touchtype Limited | System and method for inputting images or labels into electronic devices |
CN105814519A (en) * | 2013-12-12 | 2016-07-27 | 触摸式有限公司 | System and method for inputting images or labels into electronic devices |
US10348664B2 (en) | 2013-12-13 | 2019-07-09 | Google Technology Holdings LLC | Method and system for achieving communications in a manner accounting for one or more user preferences or contexts |
US20150172242A1 (en) * | 2013-12-13 | 2015-06-18 | Motorola Mobility Llc | Method and System for Achieving Communications in a Manner Accounting for One or More User Preferences or Contexts |
US9674125B2 (en) * | 2013-12-13 | 2017-06-06 | Google Technology Holdings LLC | Method and system for achieving communications in a manner accounting for one or more user preferences or contexts |
US10349209B1 (en) | 2014-01-12 | 2019-07-09 | Investment Asset Holdings Llc | Location-based messaging |
US12041508B1 (en) | 2014-01-12 | 2024-07-16 | Investment Asset Holdings Llc | Location-based messaging |
US10080102B1 (en) | 2014-01-12 | 2018-09-18 | Investment Asset Holdings Llc | Location-based messaging |
US9866999B1 (en) | 2014-01-12 | 2018-01-09 | Investment Asset Holdings Llc | Location-based messaging |
US20150220774A1 (en) * | 2014-02-05 | 2015-08-06 | Facebook, Inc. | Ideograms for Captured Expressions |
US10009352B2 (en) * | 2014-02-05 | 2018-06-26 | Facebook, Inc. | Controlling access to ideograms |
US20170318024A1 (en) * | 2014-02-05 | 2017-11-02 | Facebook, Inc. | Controlling Access to Ideograms |
AU2014381692B2 (en) * | 2014-02-05 | 2019-02-28 | Facebook, Inc. | Ideograms based on sentiment analysis |
WO2015119605A1 (en) * | 2014-02-05 | 2015-08-13 | Facebook, Inc. | Ideograms based on sentiment analysis |
US10050926B2 (en) | 2014-02-05 | 2018-08-14 | Facebook, Inc. | Ideograms based on sentiment analysis |
US10013601B2 (en) * | 2014-02-05 | 2018-07-03 | Facebook, Inc. | Ideograms for captured expressions |
US9515968B2 (en) | 2014-02-05 | 2016-12-06 | Facebook, Inc. | Controlling access to ideograms |
US10140373B2 (en) * | 2014-04-15 | 2018-11-27 | Facebook, Inc. | Eliciting user sharing of content |
US10007404B2 (en) * | 2014-05-21 | 2018-06-26 | Ricoh Company, Ltd. | Terminal apparatus, program, method of calling function, and information processing system |
US20150339017A1 (en) * | 2014-05-21 | 2015-11-26 | Ricoh Company, Ltd. | Terminal apparatus, program, method of calling function, and information processing system |
US10572681B1 (en) | 2014-05-28 | 2020-02-25 | Snap Inc. | Apparatus and method for automated privacy protection in distributed images |
US10990697B2 (en) | 2014-05-28 | 2021-04-27 | Snap Inc. | Apparatus and method for automated privacy protection in distributed images |
US11972014B2 (en) | 2014-05-28 | 2024-04-30 | Snap Inc. | Apparatus and method for automated privacy protection in distributed images |
US10565219B2 (en) | 2014-05-30 | 2020-02-18 | Apple Inc. | Techniques for automatically generating a suggested contact based on a received message |
US10747397B2 (en) | 2014-05-30 | 2020-08-18 | Apple Inc. | Structured suggestions |
US10579212B2 (en) | 2014-05-30 | 2020-03-03 | Apple Inc. | Structured suggestions |
US10585559B2 (en) | 2014-05-30 | 2020-03-10 | Apple Inc. | Identifying contact information suggestions from a received message |
US10620787B2 (en) | 2014-05-30 | 2020-04-14 | Apple Inc. | Techniques for structuring suggested contacts and calendar events from messages |
US11921805B2 (en) | 2014-06-05 | 2024-03-05 | Snap Inc. | Web document enhancement |
US11625443B2 (en) | 2014-06-05 | 2023-04-11 | Snap Inc. | Web document enhancement |
US10524087B1 (en) | 2014-06-13 | 2019-12-31 | Snap Inc. | Message destination list mechanism |
US10659914B1 (en) | 2014-06-13 | 2020-05-19 | Snap Inc. | Geo-location based event gallery |
US10779113B2 (en) | 2014-06-13 | 2020-09-15 | Snap Inc. | Prioritization of messages within a message collection |
US10182311B2 (en) | 2014-06-13 | 2019-01-15 | Snap Inc. | Prioritization of messages within a message collection |
US10200813B1 (en) | 2014-06-13 | 2019-02-05 | Snap Inc. | Geo-location based event gallery |
US11317240B2 (en) | 2014-06-13 | 2022-04-26 | Snap Inc. | Geo-location based event gallery |
US10448201B1 (en) | 2014-06-13 | 2019-10-15 | Snap Inc. | Prioritization of messages within a message collection |
US10623891B2 (en) | 2014-06-13 | 2020-04-14 | Snap Inc. | Prioritization of messages within a message collection |
US11166121B2 (en) | 2014-06-13 | 2021-11-02 | Snap Inc. | Prioritization of messages within a message collection |
US10708203B2 (en) * | 2014-06-25 | 2020-07-07 | Convergence Acceleration Solutions, Llc | Systems and methods for indicating emotions through electronic self-portraits |
US20150381534A1 (en) * | 2014-06-25 | 2015-12-31 | Convergence Acceleration Solutions, Llc | Systems and methods for indicating emotions through electronic self-portraits |
CN106796583A (en) * | 2014-07-07 | 2017-05-31 | Machine Zone, Inc. | System and method for identifying and suggesting emoticons
US10602057B1 (en) | 2014-07-07 | 2020-03-24 | Snap Inc. | Supplying content aware photo filters |
US11122200B2 (en) | 2014-07-07 | 2021-09-14 | Snap Inc. | Supplying content aware photo filters |
US9372608B2 (en) | 2014-07-07 | 2016-06-21 | Machine Zone, Inc. | Systems and methods for identifying and suggesting emoticons |
US10432850B1 (en) | 2014-07-07 | 2019-10-01 | Snap Inc. | Apparatus and method for supplying content aware photo filters |
US11595569B2 (en) | 2014-07-07 | 2023-02-28 | Snap Inc. | Supplying content aware photo filters |
US9043196B1 (en) | 2014-07-07 | 2015-05-26 | Machine Zone, Inc. | Systems and methods for identifying and suggesting emoticons |
US9690767B2 (en) | 2014-07-07 | 2017-06-27 | Machine Zone, Inc. | Systems and methods for identifying and suggesting emoticons |
JP2017527881A (en) * | 2014-07-07 | 2017-09-21 | Machine Zone, Inc. | System and method for identifying and suggesting emoticons
WO2016007122A1 (en) * | 2014-07-07 | 2016-01-14 | Machine Zone, Inc. | System and method for identifying and suggesting emoticons |
US10311139B2 (en) | 2014-07-07 | 2019-06-04 | Mz Ip Holdings, Llc | Systems and methods for identifying and suggesting emoticons |
US10154192B1 (en) | 2014-07-07 | 2018-12-11 | Snap Inc. | Apparatus and method for supplying content aware photo filters |
US10579717B2 (en) | 2014-07-07 | 2020-03-03 | Mz Ip Holdings, Llc | Systems and methods for identifying and inserting emoticons |
US11849214B2 (en) | 2014-07-07 | 2023-12-19 | Snap Inc. | Apparatus and method for supplying content aware photo filters |
US10176535B2 (en) * | 2014-09-05 | 2019-01-08 | Verizon Patent And Licensing Inc. | Method and system for providing social category indicators in a user profile header of an on-line posting |
US20160070453A1 (en) * | 2014-09-05 | 2016-03-10 | Verizon Patent And Licensing Inc. | Method and system for indicating social categories |
US11625755B1 (en) | 2014-09-16 | 2023-04-11 | Foursquare Labs, Inc. | Determining targeting information based on a predictive targeting model |
US10423983B2 (en) | 2014-09-16 | 2019-09-24 | Snap Inc. | Determining targeting information based on a predictive targeting model |
EP3195630A4 (en) * | 2014-09-18 | 2018-03-14 | Snap Inc. | Geolocation-based pictographs |
US10824654B2 (en) | 2014-09-18 | 2020-11-03 | Snap Inc. | Geolocation-based pictographs |
US11281701B2 (en) | 2014-09-18 | 2022-03-22 | Snap Inc. | Geolocation-based pictographs |
US11741136B2 (en) | 2014-09-18 | 2023-08-29 | Snap Inc. | Geolocation-based pictographs |
WO2016044424A1 (en) * | 2014-09-18 | 2016-03-24 | Snapchat, Inc. | Geolocation-based pictographs |
US11216869B2 (en) | 2014-09-23 | 2022-01-04 | Snap Inc. | User interface to augment an image using geolocation |
US20170374003A1 (en) | 2014-10-02 | 2017-12-28 | Snapchat, Inc. | Ephemeral gallery of ephemeral messages |
US12113764B2 (en) | 2014-10-02 | 2024-10-08 | Snap Inc. | Automated management of ephemeral message collections |
US11522822B1 (en) | 2014-10-02 | 2022-12-06 | Snap Inc. | Ephemeral gallery elimination based on gallery and message timers |
US11411908B1 (en) | 2014-10-02 | 2022-08-09 | Snap Inc. | Ephemeral message gallery user interface with online viewing history indicia |
US11038829B1 (en) | 2014-10-02 | 2021-06-15 | Snap Inc. | Ephemeral gallery of ephemeral messages with opt-in permanence |
US10476830B2 (en) | 2014-10-02 | 2019-11-12 | Snap Inc. | Ephemeral gallery of ephemeral messages |
US20200394627A1 (en) * | 2014-10-08 | 2020-12-17 | Jgist, Inc. | Universal Symbol System Language-One World Language |
US10769607B2 (en) * | 2014-10-08 | 2020-09-08 | Jgist, Inc. | Universal symbol system language-one world language |
US11190679B2 (en) | 2014-11-12 | 2021-11-30 | Snap Inc. | Accessing media at a geographic location |
US10616476B1 (en) | 2014-11-12 | 2020-04-07 | Snap Inc. | User interface for accessing media at a geographic location |
US11956533B2 (en) | 2014-11-12 | 2024-04-09 | Snap Inc. | Accessing media at a geographic location |
US20160154825A1 (en) * | 2014-11-27 | 2016-06-02 | Inventec (Pudong) Technology Corporation | Emotion image recommendation system and method thereof |
US9575996B2 (en) * | 2014-11-27 | 2017-02-21 | Inventec (Pudong) Technology Corporation | Emotion image recommendation system and method thereof |
US10580458B2 (en) | 2014-12-19 | 2020-03-03 | Snap Inc. | Gallery of videos set to an audio time line |
US11250887B2 (en) | 2014-12-19 | 2022-02-15 | Snap Inc. | Routing messages by message parameter |
US10811053B2 (en) | 2014-12-19 | 2020-10-20 | Snap Inc. | Routing messages by message parameter |
US11372608B2 (en) | 2014-12-19 | 2022-06-28 | Snap Inc. | Gallery of messages from individuals with a shared interest |
US11783862B2 (en) | 2014-12-19 | 2023-10-10 | Snap Inc. | Routing messages by message parameter |
US11803345B2 (en) | 2014-12-19 | 2023-10-31 | Snap Inc. | Gallery of messages from individuals with a shared interest |
US11734342B2 (en) | 2015-01-09 | 2023-08-22 | Snap Inc. | Object recognition based image overlays |
US12056182B2 (en) | 2015-01-09 | 2024-08-06 | Snap Inc. | Object recognition based image overlays |
US10157449B1 (en) | 2015-01-09 | 2018-12-18 | Snap Inc. | Geo-location-based image filters |
US10380720B1 (en) | 2015-01-09 | 2019-08-13 | Snap Inc. | Location-based image filters |
US11301960B2 (en) | 2015-01-09 | 2022-04-12 | Snap Inc. | Object recognition based image filters |
US11388226B1 (en) | 2015-01-13 | 2022-07-12 | Snap Inc. | Guided personal identity based actions |
US11962645B2 (en) | 2015-01-13 | 2024-04-16 | Snap Inc. | Guided personal identity based actions |
US11249617B1 (en) | 2015-01-19 | 2022-02-15 | Snap Inc. | Multichannel system |
US10652287B2 (en) | 2015-01-20 | 2020-05-12 | Tencent Technology (Shenzhen) Company Limited | Method, device, and system for managing information recommendation |
US11528579B2 (en) | 2015-01-26 | 2022-12-13 | Snap Inc. | Content request by location |
US11910267B2 (en) | 2015-01-26 | 2024-02-20 | Snap Inc. | Content request by location |
US10932085B1 (en) | 2015-01-26 | 2021-02-23 | Snap Inc. | Content request by location |
US10536800B1 (en) | 2015-01-26 | 2020-01-14 | Snap Inc. | Content request by location |
US10123166B2 (en) | 2015-01-26 | 2018-11-06 | Snap Inc. | Content request by location |
US10248850B2 (en) * | 2015-02-27 | 2019-04-02 | Immersion Corporation | Generating actions based on a user's mood |
US10223397B1 (en) | 2015-03-13 | 2019-03-05 | Snap Inc. | Social graph based co-location of network users |
US10616239B2 (en) | 2015-03-18 | 2020-04-07 | Snap Inc. | Geo-fence authorization provisioning |
CN104699662A (en) * | 2015-03-18 | 2015-06-10 | Beijing Jiaotong University | Method and device for recognizing whole symbol string
US11902287B2 (en) | 2015-03-18 | 2024-02-13 | Snap Inc. | Geo-fence authorization provisioning |
US10893055B2 (en) | 2015-03-18 | 2021-01-12 | Snap Inc. | Geo-fence authorization provisioning |
US10948717B1 (en) | 2015-03-23 | 2021-03-16 | Snap Inc. | Reducing boot time and power consumption in wearable display systems |
US11662576B2 (en) | 2015-03-23 | 2023-05-30 | Snap Inc. | Reducing boot time and power consumption in displaying data content |
US11320651B2 (en) | 2015-03-23 | 2022-05-03 | Snap Inc. | Reducing boot time and power consumption in displaying data content |
US11644953B2 (en) | 2015-04-02 | 2023-05-09 | Meta Platforms, Inc. | Techniques for context sensitive illustrated graphical user interface elements |
US11221736B2 (en) | 2015-04-02 | 2022-01-11 | Facebook, Inc. | Techniques for context sensitive illustrated graphical user interface elements |
US10353542B2 (en) * | 2015-04-02 | 2019-07-16 | Facebook, Inc. | Techniques for context sensitive illustrated graphical user interface elements |
US11496544B2 (en) | 2015-05-05 | 2022-11-08 | Snap Inc. | Story and sub-story navigation |
US11449539B2 (en) | 2015-05-05 | 2022-09-20 | Snap Inc. | Automated local story generation and curation |
US11392633B2 (en) | 2015-05-05 | 2022-07-19 | Snap Inc. | Systems and methods for automated local story generation and curation |
US10592574B2 (en) | 2015-05-05 | 2020-03-17 | Snap Inc. | Systems and methods for automated local story generation and curation |
US10911575B1 (en) | 2015-05-05 | 2021-02-02 | Snap Inc. | Systems and methods for story and sub-story navigation |
US11025565B2 (en) | 2015-06-07 | 2021-06-01 | Apple Inc. | Personalized prediction of responses for instant messaging |
US10993069B2 (en) | 2015-07-16 | 2021-04-27 | Snap Inc. | Dynamically adaptive media content delivery |
US11199941B2 (en) | 2015-08-10 | 2021-12-14 | Tung Inc. | Conversion and display of a user input |
US10528219B2 (en) | 2015-08-10 | 2020-01-07 | Tung Inc. | Conversion and display of a user input |
US10817898B2 (en) | 2015-08-13 | 2020-10-27 | Placed, Llc | Determining exposures to content presented by physical objects |
US11961116B2 (en) | 2015-08-13 | 2024-04-16 | Foursquare Labs, Inc. | Determining exposures to content presented by physical objects |
US10341826B2 (en) | 2015-08-14 | 2019-07-02 | Apple Inc. | Easy location sharing |
US12089121B2 (en) | 2015-08-14 | 2024-09-10 | Apple Inc. | Easy location sharing |
US11418929B2 (en) | 2015-08-14 | 2022-08-16 | Apple Inc. | Easy location sharing |
US10152207B2 (en) * | 2015-08-26 | 2018-12-11 | Xiaomi Inc. | Method and device for changing emoticons in a chat interface |
US11048873B2 (en) * | 2015-09-15 | 2021-06-29 | Apple Inc. | Emoji and canned responses |
US10445425B2 (en) * | 2015-09-15 | 2019-10-15 | Apple Inc. | Emoji and canned responses |
US20170083506A1 (en) * | 2015-09-21 | 2017-03-23 | International Business Machines Corporation | Suggesting emoji characters based on current contextual emotional state of user |
US9665567B2 (en) * | 2015-09-21 | 2017-05-30 | International Business Machines Corporation | Suggesting emoji characters based on current contextual emotional state of user |
EP3370136A4 (en) * | 2015-10-26 | 2018-10-10 | Baidu Online Network Technology (Beijing) Co., Ltd | Input data processing method, apparatus and device, and non-volatile computer storage medium |
US11315331B2 (en) | 2015-10-30 | 2022-04-26 | Snap Inc. | Image based tracking in augmented reality systems |
US10366543B1 (en) | 2015-10-30 | 2019-07-30 | Snap Inc. | Image based tracking in augmented reality systems |
US10733802B2 (en) | 2015-10-30 | 2020-08-04 | Snap Inc. | Image based tracking in augmented reality systems |
US11769307B2 (en) | 2015-10-30 | 2023-09-26 | Snap Inc. | Image based tracking in augmented reality systems |
US12079931B2 (en) | 2015-11-30 | 2024-09-03 | Snap Inc. | Image and point cloud based tracking and in augmented reality systems |
US10997783B2 (en) | 2015-11-30 | 2021-05-04 | Snap Inc. | Image and point cloud based tracking and in augmented reality systems |
US11599241B2 (en) | 2015-11-30 | 2023-03-07 | Snap Inc. | Network resource location linking and visual content sharing |
US11380051B2 (en) | 2015-11-30 | 2022-07-05 | Snap Inc. | Image and point cloud based tracking and in augmented reality systems |
US10474321B2 (en) | 2015-11-30 | 2019-11-12 | Snap Inc. | Network resource location linking and visual content sharing |
US11468615B2 (en) | 2015-12-18 | 2022-10-11 | Snap Inc. | Media overlay publication system |
US20170177554A1 (en) * | 2015-12-18 | 2017-06-22 | International Business Machines Corporation | Culturally relevant emoji character replacement |
US10354425B2 (en) | 2015-12-18 | 2019-07-16 | Snap Inc. | Method and system for providing context relevant media augmentation |
US11830117B2 (en) | 2015-12-18 | 2023-11-28 | Snap Inc | Media overlay publication system |
US10719233B2 (en) | 2016-01-18 | 2020-07-21 | Microsoft Technology Licensing, Llc | Arc keyboard layout |
US10628036B2 (en) | 2016-01-18 | 2020-04-21 | Microsoft Technology Licensing, Llc | Keyboard customization |
US11889381B2 (en) | 2016-02-26 | 2024-01-30 | Snap Inc. | Generation, curation, and presentation of media collections |
US10834525B2 (en) | 2016-02-26 | 2020-11-10 | Snap Inc. | Generation, curation, and presentation of media collections |
US11611846B2 (en) | 2016-02-26 | 2023-03-21 | Snap Inc. | Generation, curation, and presentation of media collections |
US11023514B2 (en) | 2016-02-26 | 2021-06-01 | Snap Inc. | Methods and systems for generation, curation, and presentation of media collections |
US11197123B2 (en) | 2016-02-26 | 2021-12-07 | Snap Inc. | Generation, curation, and presentation of media collections |
US10679389B2 (en) | 2016-02-26 | 2020-06-09 | Snap Inc. | Methods and systems for generation, curation, and presentation of media collections |
US10423240B2 (en) | 2016-02-29 | 2019-09-24 | Samsung Electronics Co., Ltd. | Predicting text input based on user demographic information and context information |
US10921903B2 (en) | 2016-02-29 | 2021-02-16 | Samsung Electronics Co., Ltd. | Predicting text input based on user demographic information and context information |
WO2017150860A1 (en) * | 2016-02-29 | 2017-09-08 | Samsung Electronics Co., Ltd. | Predicting text input based on user demographic information and context information |
US10476819B2 (en) * | 2016-03-31 | 2019-11-12 | Atlassian Pty Ltd | Systems and methods for providing controls in a messaging interface |
US20170289073A1 (en) * | 2016-03-31 | 2017-10-05 | Atlassian Pty Ltd | Systems and methods for providing controls in a messaging interface |
US11631276B2 (en) | 2016-03-31 | 2023-04-18 | Snap Inc. | Automated avatar generation |
US11494547B2 (en) | 2016-04-13 | 2022-11-08 | Microsoft Technology Licensing, Llc | Inputting images to electronic devices |
WO2017184213A1 (en) * | 2016-04-20 | 2017-10-26 | Google Inc. | Iconographic suggestions within a keyboard |
US20170308289A1 (en) * | 2016-04-20 | 2017-10-26 | Google Inc. | Iconographic symbol search within a graphical keyboard |
EP3403167A1 (en) * | 2016-04-20 | 2018-11-21 | Google LLC | Iconographic symbol search within a graphical keyboard |
US20180248821A1 (en) * | 2016-05-06 | 2018-08-30 | Tencent Technology (Shenzhen) Company Limited | Information pushing method, apparatus, and system, and computer storage medium |
US10791074B2 (en) * | 2016-05-06 | 2020-09-29 | Tencent Technology (Shenzhen) Company Limited | Information pushing method, apparatus, and system, and computer storage medium |
US20170351678A1 (en) * | 2016-06-03 | 2017-12-07 | Facebook, Inc. | Profile Suggestions |
US10372310B2 (en) | 2016-06-23 | 2019-08-06 | Microsoft Technology Licensing, Llc | Suppression of input images |
WO2017223011A1 (en) * | 2016-06-23 | 2017-12-28 | Microsoft Technology Licensing, Llc | Emoji prediction by suppression |
US11640625B2 (en) | 2016-06-28 | 2023-05-02 | Snap Inc. | Generation, curation, and presentation of media collections with automated advertising |
US10885559B1 (en) | 2016-06-28 | 2021-01-05 | Snap Inc. | Generation, curation, and presentation of media collections with automated advertising |
US11445326B2 (en) | 2016-06-28 | 2022-09-13 | Snap Inc. | Track engagement of media items |
US10327100B1 (en) | 2016-06-28 | 2019-06-18 | Snap Inc. | System to track engagement of media items |
US10165402B1 (en) | 2016-06-28 | 2018-12-25 | Snap Inc. | System to track engagement of media items |
US10430838B1 (en) | 2016-06-28 | 2019-10-01 | Snap Inc. | Methods and systems for generation, curation, and presentation of media collections with automated advertising |
US10785597B2 (en) | 2016-06-28 | 2020-09-22 | Snap Inc. | System to track engagement of media items |
US12033191B2 (en) | 2016-06-28 | 2024-07-09 | Snap Inc. | Generation, curation, and presentation of media collections with automated advertising |
US10735892B2 (en) | 2016-06-28 | 2020-08-04 | Snap Inc. | System to track engagement of media items |
US10506371B2 (en) | 2016-06-28 | 2019-12-10 | Snap Inc. | System to track engagement of media items |
US10219110B2 (en) | 2016-06-28 | 2019-02-26 | Snap Inc. | System to track engagement of media items |
US11080351B1 (en) | 2016-06-30 | 2021-08-03 | Snap Inc. | Automated content curation and communication |
US11895068B2 (en) | 2016-06-30 | 2024-02-06 | Snap Inc. | Automated content curation and communication |
US10387514B1 (en) | 2016-06-30 | 2019-08-20 | Snap Inc. | Automated content curation and communication |
US9723127B1 (en) * | 2016-07-12 | 2017-08-01 | Detrice Grayson | Emoticon scripture system |
US11509615B2 (en) | 2016-07-19 | 2022-11-22 | Snap Inc. | Generating customized electronic messaging graphics |
US10348662B2 (en) | 2016-07-19 | 2019-07-09 | Snap Inc. | Generating customized electronic messaging graphics |
CN106293120A (en) * | 2016-07-29 | 2017-01-04 | Vivo Mobile Communication Co., Ltd. | Expression input method and mobile terminal
US12002232B2 (en) | 2016-08-30 | 2024-06-04 | Snap Inc. | Systems and methods for simultaneous localization and mapping |
US10210865B2 (en) | 2016-08-30 | 2019-02-19 | Beijing Baidu Netcom Science And Technology Co., Ltd. | Method and apparatus for inputting information |
CN106372059A (en) * | 2016-08-30 | 2017-02-01 | Beijing Baidu Netcom Science and Technology Co., Ltd. | Information input method and information input device
US11816853B2 (en) | 2016-08-30 | 2023-11-14 | Snap Inc. | Systems and methods for simultaneous localization and mapping |
EP3291224A1 (en) * | 2016-08-30 | 2018-03-07 | Beijing Baidu Netcom Science and Technology Co., Ltd | Method and apparatus for inputting information |
US10877629B2 (en) | 2016-10-13 | 2020-12-29 | Tung Inc. | Conversion and display of a user input |
US11843456B2 (en) | 2016-10-24 | 2023-12-12 | Snap Inc. | Generating and displaying customized avatars in media overlays |
US12113760B2 (en) | 2016-10-24 | 2024-10-08 | Snap Inc. | Generating and displaying customized avatars in media overlays |
US11876762B1 (en) | 2016-10-24 | 2024-01-16 | Snap Inc. | Generating and displaying customized avatars in media overlays |
US11750767B2 (en) | 2016-11-07 | 2023-09-05 | Snap Inc. | Selective identification and order of image modifiers |
US10623666B2 (en) | 2016-11-07 | 2020-04-14 | Snap Inc. | Selective identification and order of image modifiers |
US11233952B2 (en) | 2016-11-07 | 2022-01-25 | Snap Inc. | Selective identification and order of image modifiers |
US10592103B2 (en) * | 2016-11-22 | 2020-03-17 | Lg Electronics Inc. | Mobile terminal and method for controlling the same |
US12099707B2 (en) | 2016-12-09 | 2024-09-24 | Snap Inc. | Customized media overlays |
US11397517B2 (en) | 2016-12-09 | 2022-07-26 | Snap Inc. | Customized media overlays |
US10754525B1 (en) | 2016-12-09 | 2020-08-25 | Snap Inc. | Customized media overlays |
US10203855B2 (en) | 2016-12-09 | 2019-02-12 | Snap Inc. | Customized user-controlled media overlays |
US11616745B2 (en) | 2017-01-09 | 2023-03-28 | Snap Inc. | Contextual generation and selection of customized media content |
US12028301B2 (en) | 2017-01-09 | 2024-07-02 | Snap Inc. | Contextual generation and selection of customized media content |
US11870743B1 (en) | 2017-01-23 | 2024-01-09 | Snap Inc. | Customized digital avatar accessories |
US10915911B2 (en) | 2017-02-03 | 2021-02-09 | Snap Inc. | System to determine a price-schedule to distribute media content |
US11250075B1 (en) | 2017-02-17 | 2022-02-15 | Snap Inc. | Searching social media content |
US11861795B1 (en) | 2017-02-17 | 2024-01-02 | Snap Inc. | Augmented reality anamorphosis system |
US10319149B1 (en) | 2017-02-17 | 2019-06-11 | Snap Inc. | Augmented reality anamorphosis system |
US12050654B2 (en) | 2017-02-17 | 2024-07-30 | Snap Inc. | Searching social media content |
US11720640B2 (en) | 2017-02-17 | 2023-08-08 | Snap Inc. | Searching social media content |
US11189299B1 (en) | 2017-02-20 | 2021-11-30 | Snap Inc. | Augmented reality speech balloon system |
US11748579B2 (en) | 2017-02-20 | 2023-09-05 | Snap Inc. | Augmented reality speech balloon system |
US11670057B2 (en) | 2017-03-06 | 2023-06-06 | Snap Inc. | Virtual vision system |
US11037372B2 (en) | 2017-03-06 | 2021-06-15 | Snap Inc. | Virtual vision system |
US11961196B2 (en) | 2017-03-06 | 2024-04-16 | Snap Inc. | Virtual vision system |
US12047344B2 (en) | 2017-03-09 | 2024-07-23 | Snap Inc. | Restricted group content collection |
US10887269B1 (en) | 2017-03-09 | 2021-01-05 | Snap Inc. | Restricted group content collection |
US10523625B1 (en) | 2017-03-09 | 2019-12-31 | Snap Inc. | Restricted group content collection |
US11258749B2 (en) | 2017-03-09 | 2022-02-22 | Snap Inc. | Restricted group content collection |
US11558678B2 (en) | 2017-03-27 | 2023-01-17 | Snap Inc. | Generating a stitched data stream |
US11297399B1 (en) | 2017-03-27 | 2022-04-05 | Snap Inc. | Generating a stitched data stream |
US11349796B2 (en) | 2017-03-27 | 2022-05-31 | Snap Inc. | Generating a stitched data stream |
US11170393B1 (en) | 2017-04-11 | 2021-11-09 | Snap Inc. | System to calculate an engagement score of location based media content |
US11195018B1 (en) | 2017-04-20 | 2021-12-07 | Snap Inc. | Augmented reality typography personalization system |
US12033253B2 (en) | 2017-04-20 | 2024-07-09 | Snap Inc. | Augmented reality typography personalization system |
US10387730B1 (en) | 2017-04-20 | 2019-08-20 | Snap Inc. | Augmented reality typography personalization system |
US11842411B2 (en) | 2017-04-27 | 2023-12-12 | Snap Inc. | Location-based virtual avatars |
US11392264B1 (en) | 2017-04-27 | 2022-07-19 | Snap Inc. | Map-based graphical user interface for multi-type social media galleries |
US10963529B1 (en) | 2017-04-27 | 2021-03-30 | Snap Inc. | Location-based search mechanism in a graphical user interface |
US11418906B2 (en) | 2017-04-27 | 2022-08-16 | Snap Inc. | Selective location-based identity communication |
US12086381B2 (en) | 2017-04-27 | 2024-09-10 | Snap Inc. | Map-based graphical user interface for multi-type social media galleries |
US11409407B2 (en) | 2017-04-27 | 2022-08-09 | Snap Inc. | Map-based graphical user interface indicating geospatial activity metrics |
US11782574B2 (en) | 2017-04-27 | 2023-10-10 | Snap Inc. | Map-based graphical user interface indicating geospatial activity metrics |
US11451956B1 (en) | 2017-04-27 | 2022-09-20 | Snap Inc. | Location privacy management on map-based social media platforms |
US11893647B2 (en) | 2017-04-27 | 2024-02-06 | Snap Inc. | Location-based virtual avatars |
US11995288B2 (en) | 2017-04-27 | 2024-05-28 | Snap Inc. | Location-based search mechanism in a graphical user interface |
US11474663B2 (en) | 2017-04-27 | 2022-10-18 | Snap Inc. | Location-based search mechanism in a graphical user interface |
US11385763B2 (en) | 2017-04-27 | 2022-07-12 | Snap Inc. | Map-based graphical user interface indicating geospatial activity metrics |
US10952013B1 (en) | 2017-04-27 | 2021-03-16 | Snap Inc. | Selective location-based identity communication |
US11556221B2 (en) | 2017-04-27 | 2023-01-17 | Snap Inc. | Friend location sharing mechanism for social media platforms |
US12058583B2 (en) | 2017-04-27 | 2024-08-06 | Snap Inc. | Selective location-based identity communication |
US12112013B2 (en) | 2017-04-27 | 2024-10-08 | Snap Inc. | Location privacy management on map-based social media platforms |
US11232040B1 (en) | 2017-04-28 | 2022-01-25 | Snap Inc. | Precaching unlockable data elements |
US11675831B2 (en) | 2017-05-31 | 2023-06-13 | Snap Inc. | Geolocation based playlists |
US10318109B2 (en) | 2017-06-09 | 2019-06-11 | Microsoft Technology Licensing, Llc | Emoji suggester and adapted user interface |
WO2018226352A1 (en) * | 2017-06-09 | 2018-12-13 | Microsoft Technology Licensing, Llc | Emoji suggester and adapted user interface |
CN110741348A (en) * | 2017-06-09 | 2020-01-31 | 微软技术许可有限责任公司 | Emoticon advisor and adapted user interface |
US11620001B2 (en) * | 2017-06-29 | 2023-04-04 | Snap Inc. | Pictorial symbol prediction |
US11475254B1 (en) | 2017-09-08 | 2022-10-18 | Snap Inc. | Multimodal entity identification |
US11335067B2 (en) | 2017-09-15 | 2022-05-17 | Snap Inc. | Augmented reality system |
US10740974B1 (en) | 2017-09-15 | 2020-08-11 | Snap Inc. | Augmented reality system |
US11721080B2 (en) | 2017-09-15 | 2023-08-08 | Snap Inc. | Augmented reality system |
WO2019060351A1 (en) | 2017-09-21 | 2019-03-28 | Mz Ip Holdings, Llc | System and method for utilizing memory-efficient data structures for emoji suggestions |
US12010582B2 (en) | 2017-10-09 | 2024-06-11 | Snap Inc. | Context sensitive presentation of content |
US10499191B1 (en) | 2017-10-09 | 2019-12-03 | Snap Inc. | Context sensitive presentation of content |
US11617056B2 (en) | 2017-10-09 | 2023-03-28 | Snap Inc. | Context sensitive presentation of content |
US11006242B1 (en) | 2017-10-09 | 2021-05-11 | Snap Inc. | Context sensitive presentation of content |
US11423596B2 (en) * | 2017-10-23 | 2022-08-23 | Paypal, Inc. | System and method for generating emoji mashups with machine learning |
US11783113B2 (en) | 2017-10-23 | 2023-10-10 | Paypal, Inc. | System and method for generating emoji mashups with machine learning |
US11670025B2 (en) | 2017-10-30 | 2023-06-06 | Snap Inc. | Mobile-based cartographic control of display content |
US11030787B2 (en) | 2017-10-30 | 2021-06-08 | Snap Inc. | Mobile-based cartographic control of display content |
US11558327B2 (en) | 2017-12-01 | 2023-01-17 | Snap Inc. | Dynamic media overlay with smart widget |
US11943185B2 (en) | 2017-12-01 | 2024-03-26 | Snap Inc. | Dynamic media overlay with smart widget |
US11265273B1 (en) | 2017-12-01 | 2022-03-01 | Snap, Inc. | Dynamic media overlay with smart widget |
US12056454B2 (en) | 2017-12-22 | 2024-08-06 | Snap Inc. | Named entity recognition visual context and caption data |
US11017173B1 (en) | 2017-12-22 | 2021-05-25 | Snap Inc. | Named entity recognition visual context and caption data |
US11687720B2 (en) | 2017-12-22 | 2023-06-27 | Snap Inc. | Named entity recognition visual context and caption data |
US11487794B2 (en) | 2018-01-03 | 2022-11-01 | Snap Inc. | Tag distribution visualization system |
US10678818B2 (en) | 2018-01-03 | 2020-06-09 | Snap Inc. | Tag distribution visualization system |
US11983215B2 (en) | 2018-01-03 | 2024-05-14 | Snap Inc. | Tag distribution visualization system |
US11841896B2 (en) | 2018-02-13 | 2023-12-12 | Snap Inc. | Icon based tagging |
US11507614B1 (en) | 2018-02-13 | 2022-11-22 | Snap Inc. | Icon based tagging |
US11523159B2 (en) | 2018-02-28 | 2022-12-06 | Snap Inc. | Generating media content items based on location information |
US10979752B1 (en) | 2018-02-28 | 2021-04-13 | Snap Inc. | Generating media content items based on location information |
US10885136B1 (en) | 2018-02-28 | 2021-01-05 | Snap Inc. | Audience filtering system |
US11570572B2 (en) | 2018-03-06 | 2023-01-31 | Snap Inc. | Geo-fence selection system |
US11722837B2 (en) | 2018-03-06 | 2023-08-08 | Snap Inc. | Geo-fence selection system |
US11044574B2 (en) | 2018-03-06 | 2021-06-22 | Snap Inc. | Geo-fence selection system |
US10524088B2 (en) | 2018-03-06 | 2019-12-31 | Snap Inc. | Geo-fence selection system |
US10327096B1 (en) | 2018-03-06 | 2019-06-18 | Snap Inc. | Geo-fence selection system |
US11998833B2 (en) | 2018-03-14 | 2024-06-04 | Snap Inc. | Generating collectible items based on location information |
US10933311B2 (en) | 2018-03-14 | 2021-03-02 | Snap Inc. | Generating collectible items based on location information |
US11491393B2 (en) | 2018-03-14 | 2022-11-08 | Snap Inc. | Generating collectible items based on location information |
US12056441B2 (en) | 2018-03-30 | 2024-08-06 | Snap Inc. | Annotating a collection of media content items |
US11163941B1 (en) | 2018-03-30 | 2021-11-02 | Snap Inc. | Annotating a collection of media content items |
US11297463B2 (en) | 2018-04-18 | 2022-04-05 | Snap Inc. | Visitation tracking system |
US10681491B1 (en) | 2018-04-18 | 2020-06-09 | Snap Inc. | Visitation tracking system |
US12035198B2 (en) | 2018-04-18 | 2024-07-09 | Snap Inc. | Visitation tracking system |
US10448199B1 (en) | 2018-04-18 | 2019-10-15 | Snap Inc. | Visitation tracking system |
US10219111B1 (en) | 2018-04-18 | 2019-02-26 | Snap Inc. | Visitation tracking system |
US11683657B2 (en) | 2018-04-18 | 2023-06-20 | Snap Inc. | Visitation tracking system |
US10924886B2 (en) | 2018-04-18 | 2021-02-16 | Snap Inc. | Visitation tracking system |
US10779114B2 (en) | 2018-04-18 | 2020-09-15 | Snap Inc. | Visitation tracking system |
US11573679B2 (en) * | 2018-04-30 | 2023-02-07 | The Trustees of the California State University | Integration of user emotions for a smartphone or other communication device environment |
US10699104B2 (en) * | 2018-05-03 | 2020-06-30 | International Business Machines Corporation | Image obtaining based on emotional status |
US20190340425A1 (en) * | 2018-05-03 | 2019-11-07 | International Business Machines Corporation | Image obtaining based on emotional status |
US11103161B2 (en) | 2018-05-07 | 2021-08-31 | Apple Inc. | Displaying user interfaces associated with physical activities |
US11782575B2 (en) | 2018-05-07 | 2023-10-10 | Apple Inc. | User interfaces for sharing contextually relevant media content |
US11936601B1 (en) * | 2018-05-10 | 2024-03-19 | Whatsapp Llc | Modifying message content based on user preferences |
US11496425B1 (en) * | 2018-05-10 | 2022-11-08 | Whatsapp Llc | Modifying message content based on user preferences |
US11860888B2 (en) | 2018-05-22 | 2024-01-02 | Snap Inc. | Event detection system |
US11670026B2 (en) | 2018-07-24 | 2023-06-06 | Snap Inc. | Conditional modification of augmented reality object |
US10943381B2 (en) | 2018-07-24 | 2021-03-09 | Snap Inc. | Conditional modification of augmented reality object |
US10679393B2 (en) | 2018-07-24 | 2020-06-09 | Snap Inc. | Conditional modification of augmented reality object |
US12039649B2 (en) | 2018-07-24 | 2024-07-16 | Snap Inc. | Conditional modification of augmented reality object |
US11367234B2 (en) | 2018-07-24 | 2022-06-21 | Snap Inc. | Conditional modification of augmented reality object |
US10789749B2 (en) | 2018-07-24 | 2020-09-29 | Snap Inc. | Conditional modification of augmented reality object |
US11450050B2 (en) | 2018-08-31 | 2022-09-20 | Snap Inc. | Augmented reality anthropomorphization system |
US11676319B2 (en) | 2018-08-31 | 2023-06-13 | Snap Inc. | Augmented reality anthropomorphization system |
US10997760B2 (en) | 2018-08-31 | 2021-05-04 | Snap Inc. | Augmented reality anthropomorphization system |
US12105938B2 (en) | 2018-09-28 | 2024-10-01 | Snap Inc. | Collaborative achievement interface |
US11455082B2 (en) | 2018-09-28 | 2022-09-27 | Snap Inc. | Collaborative achievement interface |
US11704005B2 (en) | 2018-09-28 | 2023-07-18 | Snap Inc. | Collaborative achievement interface |
US11799811B2 (en) | 2018-10-31 | 2023-10-24 | Snap Inc. | Messaging and gaming applications communication platform |
US11558709B2 (en) | 2018-11-30 | 2023-01-17 | Snap Inc. | Position service to determine relative position to map features |
US11698722B2 (en) | 2018-11-30 | 2023-07-11 | Snap Inc. | Generating customized avatars based on location information |
US11199957B1 (en) | 2018-11-30 | 2021-12-14 | Snap Inc. | Generating customized avatars based on location information |
US10871877B1 (en) * | 2018-11-30 | 2020-12-22 | Facebook, Inc. | Content-based contextual reactions for posts on a social networking system |
US11812335B2 (en) | 2018-11-30 | 2023-11-07 | Snap Inc. | Position service to determine relative position to map features |
US11877211B2 (en) | 2019-01-14 | 2024-01-16 | Snap Inc. | Destination sharing in location sharing system |
US11751015B2 (en) | 2019-01-16 | 2023-09-05 | Snap Inc. | Location-based context information sharing in a messaging system |
US11294936B1 (en) | 2019-01-30 | 2022-04-05 | Snap Inc. | Adaptive spatial density based clustering |
US11693887B2 (en) | 2019-01-30 | 2023-07-04 | Snap Inc. | Adaptive spatial density based clustering |
US11972529B2 (en) | 2019-02-01 | 2024-04-30 | Snap Inc. | Augmented reality system |
US11809624B2 (en) | 2019-02-13 | 2023-11-07 | Snap Inc. | Sleep detection in a location sharing system |
US11500525B2 (en) | 2019-02-25 | 2022-11-15 | Snap Inc. | Custom media overlay system |
US11954314B2 (en) | 2019-02-25 | 2024-04-09 | Snap Inc. | Custom media overlay system |
US11574431B2 (en) | 2019-02-26 | 2023-02-07 | Snap Inc. | Avatar based on weather |
CN109918530A (en) * | 2019-03-04 | 2019-06-21 | 北京字节跳动网络技术有限公司 | Method and apparatus for pushing image |
US11301117B2 (en) | 2019-03-08 | 2022-04-12 | Snap Inc. | Contextual information in chat |
US11868414B1 (en) | 2019-03-14 | 2024-01-09 | Snap Inc. | Graph-based prediction for contact suggestion in a location sharing system |
US11852554B1 (en) | 2019-03-21 | 2023-12-26 | Snap Inc. | Barometer calibration in a location sharing system |
US11249614B2 (en) | 2019-03-28 | 2022-02-15 | Snap Inc. | Generating personalized map interface with enhanced icons |
US11740760B2 (en) | 2019-03-28 | 2023-08-29 | Snap Inc. | Generating personalized map interface with enhanced icons |
US12039658B2 (en) | 2019-04-01 | 2024-07-16 | Snap Inc. | Semantic texture mapping system |
US11361493B2 (en) | 2019-04-01 | 2022-06-14 | Snap Inc. | Semantic texture mapping system |
US10659405B1 (en) | 2019-05-06 | 2020-05-19 | Apple Inc. | Avatar integration with multiple applications |
US11963105B2 (en) | 2019-05-30 | 2024-04-16 | Snap Inc. | Wearable device location systems architecture |
US11785549B2 (en) | 2019-05-30 | 2023-10-10 | Snap Inc. | Wearable device location systems |
US11606755B2 (en) | 2019-05-30 | 2023-03-14 | Snap Inc. | Wearable device location systems architecture |
US11206615B2 (en) | 2019-05-30 | 2021-12-21 | Snap Inc. | Wearable device location systems |
US11347943B2 (en) | 2019-06-01 | 2022-05-31 | Apple Inc. | Mail application features |
US11620046B2 (en) | 2019-06-01 | 2023-04-04 | Apple Inc. | Keyboard management user interfaces |
US11842044B2 (en) | 2019-06-01 | 2023-12-12 | Apple Inc. | Keyboard management user interfaces |
US11194467B2 (en) | 2019-06-01 | 2021-12-07 | Apple Inc. | Keyboard management user interfaces |
US11074408B2 (en) | 2019-06-01 | 2021-07-27 | Apple Inc. | Mail application features |
US11917495B2 (en) | 2019-06-07 | 2024-02-27 | Snap Inc. | Detection of a physical collision between two client devices in a location sharing system |
US11601783B2 (en) | 2019-06-07 | 2023-03-07 | Snap Inc. | Detection of a physical collision between two client devices in a location sharing system |
US11714535B2 (en) | 2019-07-11 | 2023-08-01 | Snap Inc. | Edge gesture interface with smart interactions |
US11821742B2 (en) | 2019-09-26 | 2023-11-21 | Snap Inc. | Travel based notifications |
US11218838B2 (en) | 2019-10-31 | 2022-01-04 | Snap Inc. | Focused map-based context information surfacing |
US11128715B1 (en) | 2019-12-30 | 2021-09-21 | Snap Inc. | Physical friend proximity in chat |
US11429618B2 (en) | 2019-12-30 | 2022-08-30 | Snap Inc. | Surfacing augmented reality objects |
US11977553B2 (en) | 2019-12-30 | 2024-05-07 | Snap Inc. | Surfacing augmented reality objects |
US11343323B2 (en) | 2019-12-31 | 2022-05-24 | Snap Inc. | Augmented reality objects registry |
US11943303B2 (en) | 2019-12-31 | 2024-03-26 | Snap Inc. | Augmented reality objects registry |
US11893208B2 (en) | 2019-12-31 | 2024-02-06 | Snap Inc. | Combined map icon with action indicator |
US11888803B2 (en) | 2020-02-12 | 2024-01-30 | Snap Inc. | Multiple gateway message exchange |
US11228551B1 (en) | 2020-02-12 | 2022-01-18 | Snap Inc. | Multiple gateway message exchange |
US11516167B2 (en) | 2020-03-05 | 2022-11-29 | Snap Inc. | Storing data based on device location |
US11765117B2 (en) | 2020-03-05 | 2023-09-19 | Snap Inc. | Storing data based on device location |
US11619501B2 (en) | 2020-03-11 | 2023-04-04 | Snap Inc. | Avatar based on trip |
US11430091B2 (en) | 2020-03-27 | 2022-08-30 | Snap Inc. | Location mapping for large scale augmented-reality |
US11915400B2 (en) | 2020-03-27 | 2024-02-27 | Snap Inc. | Location mapping for large scale augmented-reality |
US11776256B2 (en) | 2020-03-27 | 2023-10-03 | Snap Inc. | Shared augmented reality system |
US11503432B2 (en) | 2020-06-15 | 2022-11-15 | Snap Inc. | Scalable real-time location sharing framework |
US11314776B2 (en) | 2020-06-15 | 2022-04-26 | Snap Inc. | Location sharing using friend list versions |
US11290851B2 (en) | 2020-06-15 | 2022-03-29 | Snap Inc. | Location sharing using offline and online objects |
US11483267B2 (en) | 2020-06-15 | 2022-10-25 | Snap Inc. | Location sharing using different rate-limited links |
US20220269354A1 (en) * | 2020-06-19 | 2022-08-25 | Talent Unlimited Online Services Private Limited | Artificial intelligence-based system and method for dynamically predicting and suggesting emojis for messages |
US12062235B2 (en) | 2020-06-29 | 2024-08-13 | Snap Inc. | Providing travel-based augmented reality content with a captured image |
US11676378B2 (en) | 2020-06-29 | 2023-06-13 | Snap Inc. | Providing travel-based augmented reality content with a captured image |
US12127068B2 (en) | 2020-07-30 | 2024-10-22 | Investment Asset Holdings Llc | Map interface with icon for location-based messages |
US11943192B2 (en) | 2020-08-31 | 2024-03-26 | Snap Inc. | Co-location connection service |
US20220078295A1 (en) * | 2020-09-08 | 2022-03-10 | Fujifilm Business Innovation Corp. | Information processing apparatus and system and non-transitory computer readable medium |
US11979525B2 (en) * | 2020-09-08 | 2024-05-07 | Fujifilm Business Innovation Corp. | Information processing apparatus and system and non-transitory computer readable medium |
US11044218B1 (en) * | 2020-10-23 | 2021-06-22 | Slack Technologies, Inc. | Systems and methods for reacting to messages |
CN112540756A (en) * | 2020-12-01 | 2021-03-23 | 杭州讯酷科技有限公司 | UI (user interface) construction method based on cursor position recommendation field |
US20220191158A1 (en) * | 2020-12-15 | 2022-06-16 | Kakao Corp. | Method and server for providing content list and operating method of user terminal |
US12034681B2 (en) * | 2020-12-15 | 2024-07-09 | Kakao Corp. | Method and server for providing content list and operating method of user terminal |
CN114816599A (en) * | 2021-01-22 | 2022-07-29 | 北京字跳网络技术有限公司 | Image display method, apparatus, device and medium |
US20230410394A1 (en) * | 2021-01-22 | 2023-12-21 | Beijing Zitiao Network Technology Co., Ltd. | Image display method and apparatus, device, and medium |
US12106410B2 (en) * | 2021-01-22 | 2024-10-01 | Beijing Zitiao Network Technology Co., Ltd. | Customizing emojis for users in chat applications |
US12003811B2 (en) | 2021-03-01 | 2024-06-04 | Comcast Cable Communications, Llc | Systems and methods for providing contextually relevant information |
US11516539B2 (en) * | 2021-03-01 | 2022-11-29 | Comcast Cable Communications, Llc | Systems and methods for providing contextually relevant information |
US20220279240A1 (en) * | 2021-03-01 | 2022-09-01 | Comcast Cable Communications, Llc | Systems and methods for providing contextually relevant information |
US11606756B2 (en) | 2021-03-29 | 2023-03-14 | Snap Inc. | Scheduling requests for location data |
US11902902B2 (en) | 2021-03-29 | 2024-02-13 | Snap Inc. | Scheduling requests for location data |
US11601888B2 (en) | 2021-03-29 | 2023-03-07 | Snap Inc. | Determining location using multi-source geolocation data |
US11645324B2 (en) | 2021-03-31 | 2023-05-09 | Snap Inc. | Location-based timeline media content system |
US12026362B2 (en) | 2021-05-19 | 2024-07-02 | Snap Inc. | Video editing application for mobile devices |
US20230004277A1 (en) * | 2021-06-30 | 2023-01-05 | Slack Technologies, Inc. | User interface for searching content of a communication platform using reaction icons |
US11822764B2 (en) | 2021-06-30 | 2023-11-21 | Salesforce, Inc. | User interface for searching content of a communication platform using reaction icons |
US20230400961A1 (en) * | 2021-06-30 | 2023-12-14 | Salesforce, Inc. | User interface for searching content of a communication platform using reaction icons |
US11561673B1 (en) * | 2021-06-30 | 2023-01-24 | Salesforce, Inc. | User interface for searching content of a communication platform using reaction icons |
US11765115B2 (en) * | 2021-07-29 | 2023-09-19 | Snap Inc. | Emoji recommendation system using user context and biosignals |
US20230035961A1 (en) * | 2021-07-29 | 2023-02-02 | Fannie Liu | Emoji recommendation system using user context and biosignals |
US20230123271A1 (en) * | 2021-10-20 | 2023-04-20 | International Business Machines Corporation | Decoding communications with token sky maps |
US11960845B2 (en) * | 2021-10-20 | 2024-04-16 | International Business Machines Corporation | Decoding communications with token sky maps |
US11829834B2 (en) | 2021-10-29 | 2023-11-28 | Snap Inc. | Extended QR code |
US12001750B2 (en) | 2022-04-20 | 2024-06-04 | Snap Inc. | Location-based shared augmented reality experience system |
US12020384B2 (en) | 2022-06-21 | 2024-06-25 | Snap Inc. | Integrating augmented reality experiences with other components |
US12020386B2 (en) | 2022-06-23 | 2024-06-25 | Snap Inc. | Applying pregenerated virtual experiences in new location |
US20240220717A1 (en) * | 2022-12-28 | 2024-07-04 | Twilio Inc. | Adding theme-based content to messages using artificial intelligence |
US12131003B2 (en) | 2023-05-12 | 2024-10-29 | Snap Inc. | Map-based graphical user interface indicating geospatial activity metrics |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
AU2016253602B2 (en) | | Systems and methods for identifying and suggesting emoticons |
US10579717B2 (en) | | Systems and methods for identifying and inserting emoticons |
WO2016007122A1 (en) | | System and method for identifying and suggesting emoticons |
NZ713913B2 (en) | | Systems and methods for identifying and suggesting emoticons |
NZ713912B2 (en) | | Systems and methods for identifying and suggesting emoticons |
JP2019153338A (en) | | System and method for identifying and proposing emoticons |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: ADDMIRED, INC., CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:LEYDON, GABRIEL;REEL/FRAME:028485/0937
Effective date: 20120702
|
AS | Assignment |
Owner name: MACHINE ZONE, INC., CALIFORNIA
Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE NAME OF THE ASSIGNEE PREVIOUSLY RECORDED ON REEL 028485 FRAME 0937. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT TO MACHINE ZONE, INC. (FORMERLY ADDMIRED, INC.);ASSIGNOR:LEYDON, GABRIEL;REEL/FRAME:030342/0292
Effective date: 20120702
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
|
AS | Assignment |
Owner name: MGG INVESTMENT GROUP LP, AS COLLATERAL AGENT, NEW YORK
Free format text: NOTICE OF SECURITY INTEREST -- PATENTS;ASSIGNORS:MACHINE ZONE, INC.;SATORI WORLDWIDE, LLC;COGNANT LLC;REEL/FRAME:045237/0861
Effective date: 20180201
|
AS | Assignment |
Owner name: MZ IP HOLDINGS, LLC, CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MACHINE ZONE, INC.;REEL/FRAME:045786/0179
Effective date: 20180320
|
AS | Assignment |
Owner name: COMERICA BANK, MICHIGAN
Free format text: SECURITY INTEREST;ASSIGNOR:MZ IP HOLDINGS, LLC;REEL/FRAME:046215/0207
Effective date: 20180201
|
AS | Assignment |
Owner name: COGNANT LLC, CALIFORNIA
Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:MGG INVESTMENT GROUP LP, AS COLLATERAL AGENT;REEL/FRAME:052706/0917
Effective date: 20200519

Owner name: MACHINE ZONE, INC., CALIFORNIA
Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:MGG INVESTMENT GROUP LP, AS COLLATERAL AGENT;REEL/FRAME:052706/0917
Effective date: 20200519

Owner name: MZ IP HOLDINGS, LLC, CALIFORNIA
Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:COMERICA BANK;REEL/FRAME:052706/0899
Effective date: 20200519

Owner name: SATORI WORLDWIDE, LLC, CALIFORNIA
Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:MGG INVESTMENT GROUP LP, AS COLLATERAL AGENT;REEL/FRAME:052706/0917
Effective date: 20200519