US20150220734A1 - Mobile application management - Google Patents
- Publication number
- US20150220734A1 (U.S. application Ser. No. 14/126,866)
- Authority
- US
- United States
- Prior art keywords
- application
- behaviors
- user
- code
- behavior
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/50—Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
- G06F21/55—Detecting local intrusion or implementing counter-measures
- G06F21/56—Computer malware detection or handling, e.g. anti-virus arrangements
- G06F21/562—Static detection
- G06F21/563—Static detection by source code analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F8/00—Arrangements for software engineering
- G06F8/40—Transformation of program code
- G06F8/41—Compilation
- G06F8/43—Checking; Contextual analysis
- G06F8/436—Semantic checking
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/50—Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
- G06F21/51—Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems at application loading time, e.g. accepting, rejecting, starting or inhibiting executable software based on integrity or source reliability
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/50—Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
- G06F21/55—Detecting local intrusion or implementing counter-measures
- G06F21/552—Detecting local intrusion or implementing counter-measures involving long-term monitoring or reporting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/60—Protecting data
- G06F21/62—Protecting access to data via a platform, e.g. using keys or access control rules
- G06F21/6218—Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2221/00—Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F2221/03—Indexing scheme relating to G06F21/50, monitoring users, programs or devices to maintain the integrity of platforms
- G06F2221/033—Test or assess software
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2221/00—Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F2221/21—Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F2221/2105—Dual mode as a secondary aspect
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2221/00—Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F2221/21—Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F2221/2111—Location-sensitive, e.g. geographical location, GPS
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2221/00—Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F2221/21—Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F2221/2113—Multi-level security, e.g. mandatory access control
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2221/00—Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F2221/21—Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F2221/2149—Restricted operating environment
Definitions
- This disclosure relates in general to the field of computer security and, more particularly, to security of mobile devices.
- App stores and other open marketplaces have enabled the development of tens of thousands of applications (or "apps") for such devices, including device platforms such as Google Android™, iOS™, and Windows™, with some of these applications being of questionable quality and purpose.
- FIG. 1 is a simplified schematic diagram of an example system including an application management system in accordance with one embodiment
- FIG. 2 is a simplified block diagram of an example system including an example application manager and user device in accordance with one embodiment
- FIG. 3 is a simplified block diagram representing analysis and healing of an application for a user device in accordance with one embodiment
- FIG. 4 is a simplified block diagram representing an example behavioral assessment of an application in accordance with one embodiment
- FIGS. 5A-5B are simplified representations of control flow within example applications in accordance with some embodiments.
- FIG. 6 is a simplified block diagram representing example subsystems accessible to an example user device in accordance with some embodiments
- FIG. 7 is a simplified block diagram representing use of rules to determine application behaviors in accordance with some embodiments.
- FIG. 8 is a simplified flow diagram representing assessment of application behaviors and healing of undesired behaviors in accordance with one embodiment
- FIG. 9 is a simplified flow diagram representing decisions made in connection with the management and remediation of applications determined to include undesirable behaviors based on behavioral analyses of the applications in accordance with one embodiment
- FIG. 10 is a simplified flow diagram representing an example healing of an application in accordance with one embodiment
- FIG. 11 is a simplified block diagram representing an example healing of an application in accordance with one embodiment
- FIGS. 12A-12E represent examples of detection and remediation of undesired behaviors of an application in accordance with some embodiments
- FIG. 13 is a simplified flow diagram representing an example healing of an application in accordance with one embodiment
- FIGS. 14A-14B are simplified block diagrams representing features of an example mode manager in accordance with some embodiments.
- FIGS. 15A-15B represent portions of example algorithms for managing modes in a user device in accordance with some embodiments
- FIG. 16 is a simplified block diagram for sharing device modes between devices in accordance with one embodiment
- FIG. 17 is a simplified flow diagram illustrating use of context in managing modes of a device in accordance with one embodiment
- FIG. 18 is a simplified flow diagram illustrating remote provisioning and/or activation of modes on a user device in accordance with some embodiments
- FIG. 19 is a simplified block diagram representing application information collected in accordance with some embodiments.
- FIGS. 20A-20D are screenshots of example user interfaces provided in connection with mode management of a user device in accordance with some embodiments
- FIGS. 21A-21C are flowcharts representing example operations involving an example application management system in accordance with some embodiments.
- FIG. 1 illustrates an example system 100 including, for instance, an example application management server 105 , and one or more mobile user devices 110 , 115 , 120 , 125 , such as smart phones, mobile gaming systems, tablet computers, laptops, netbooks, among other examples.
- Application management server 105 can provide one or more services to the user devices to assist in the management of applications downloaded, installed, used, or otherwise provided for the user devices 110 , 115 , 120 , 125 .
- User devices 110, 115, 120, 125 can access application servers 140, such as centralized application storefronts such as, for example, Android Market™ and iTunes™, among other examples.
- Application servers 140 can further include, in some examples, other sources of software applications that can be downloaded and installed on user devices 110 , 115 , 120 , 125 .
- User devices 110 , 115 , 120 , 125 can communicate with and consume the data and services of the application management server 105 over one or more networks 130 , including local area networks and wide area networks such as the Internet.
- applications available to user devices 110 , 115 , 120 , 125 can be analyzed, assessed, and repaired at least in part by functionality provided through application management server 105 .
- application management server 105 in connection with services made available to user devices 110 , 115 , 120 , 125 can interact with and consume resources, data, and services of other outside systems and servers such as information servers 145 .
- information servers 145 can host services and data that provide additional intelligence and context regarding applications available to user devices 110 , 115 , 120 , 125 , among other examples.
- Servers can include electronic computing devices operable to receive, transmit, process, store, or manage data and information associated with the computing environment 100.
- The terms "processor," "processor device," and "processing device" are intended to encompass any suitable processing device.
- elements shown as single devices within the computing environment 100 may be implemented using a plurality of computing devices and processors, such as server pools including multiple server computers.
- Any, all, or some of the computing devices may be adapted to execute any operating system, including Linux™, UNIX™, Microsoft Windows™, Apple OS™, Apple iOS™, Google Android™, Windows Server™, etc., as well as virtual machines adapted to virtualize execution of a particular operating system, including customized and proprietary operating systems.
- servers, user devices, network elements, systems, and other computing devices can each include one or more processors, computer-readable memory, and one or more interfaces, among other features and hardware.
- Servers can include any suitable software component or module, or computing device(s) capable of hosting and/or serving software applications and services (e.g., personal safety systems, services and applications of server 105 , etc.), including distributed, enterprise, or cloud-based software applications, data, and services.
- an application management server 105 can be comprised at least in part by cloud-implemented systems configured to remotely host, serve, or otherwise manage data, software services and applications interfacing, coordinating with, dependent on, or otherwise used by other services and devices in system 100 .
- a server, system, subsystem, or computing device can be implemented as some combination of devices that can be hosted on a common computing system, server, server pool, or cloud computing environment and share computing resources, including shared memory, processors, and interfaces.
- User, endpoint, or client computing devices can include traditional and mobile computing devices, including personal computers, laptop computers, tablet computers, smartphones, personal digital assistants, feature phones, handheld video game consoles, desktop computers, internet-enabled televisions, and other devices designed to interface with human users and capable of communicating with other devices over one or more networks (e.g., 130 ).
- Computer-assisted, or "smart," appliances can include household and industrial devices and machines that include computer processors and whose functionality is controlled, monitored, assisted, supplemented, or otherwise enhanced by the computer processor, other hardware, and/or one or more software programs executed by the computer processor.
- Computer-assisted appliances can include a wide variety of computer-assisted machines and products including refrigerators, washing machines, automobiles, HVAC systems, industrial machinery, ovens, security systems, and so on.
- Attributes of user computing devices, computer-assisted appliances, servers, and computing devices can vary widely from device to device, including the respective operating systems and collections of software programs loaded, installed, executed, operated, or otherwise accessible to each device.
- computing devices can run, execute, have installed, or otherwise include various sets of programs, including various combinations of operating systems, applications, plug-ins, applets, virtual machines, machine images, drivers, executable files, and other software-based programs capable of being run, executed, or otherwise used by the respective devices.
- Some system devices can further include at least one graphical display device and user interfaces, supported by computer processors of the system devices, that allow a user to view and interact with graphical user interfaces of applications and other programs provided in system, including user interfaces and graphical representations of programs interacting with applications hosted within the system devices as well as graphical user interfaces associated with application management server services and other applications, etc.
- Although system devices may be described in terms of being used by one user, this disclosure contemplates that many users may use one computer or that one user may use multiple computers.
- Although FIG. 1 is described as containing or being associated with a plurality of elements, not all elements illustrated within computing environment 100 of FIG. 1 may be utilized in each alternative implementation of the present disclosure. Additionally, one or more of the elements described in connection with the examples of FIG. 1 may be located external to computing environment 100, while in other instances, certain elements may be included within or as a portion of one or more of the other described elements, as well as other elements not described in the illustrated implementation. Further, certain elements illustrated in FIG. 1 may be combined with other components, as well as used for alternative or additional purposes in addition to those purposes described herein.
- FIG. 2 shows an example system including an application manager 205 and user system 210, among other computing devices and network elements including, for instance, application servers 140 and information servers 145 communicating over one or more networks 130.
- application manager 205 may include one or more processor devices 215 , memory elements 218 , and one or more other software and/or hardware-implemented components.
- an application manager 205 may include a share engine 220 , user manager 222 , healing engine 225 , behavior analysis engine 228 , application intelligence engine 230 , among other potential machine executable logic, components and functionality including combinations of the foregoing.
- a share engine 220 can be configured to provide functionality for managing crowdsourcing of information relating to applications (e.g., made available by application servers 140 ), as well as the sharing of such information and resources, including resources generated at least in part by or collected by application manager 205 .
- an example share engine 220 can allow modified applications 232 developed for particular users and associated user devices (e.g., 210 ) as well as defined application modes 240 to be shared across multiple user devices (e.g., 210 ), among other examples.
- An example user manager 222 can provide functionality for managing user accounts of various user devices (e.g., 210 ) that consume or otherwise make use of services of application manager 205 .
- An example user manager 222 can associate various modified applications 232 , application data and feedback data (e.g., 235 ), and application modes 240 , including application modes developed or modified by particular users with one or more user accounts and user devices (e.g., 210 ) in a system, among other examples.
- An application manager 205 can, in some implementations, additionally include components, engines, and modules capable of providing application management, security, and diagnostic services to one or more user devices (e.g., 210 ) in connection with user device attempts to download, install, activate, or otherwise use or procure various applications including applications provided through one or more application servers (e.g., 140 ).
- application manager 205 can include an example behavior analysis engine 228 adapted to analyze and identify functionality of various applications made available to user devices on the system. Further, functionality of applications can be identified, for instance, by behavior analysis engine 228 , that users or administrators may wish to block, limit, repair, or modify, among other examples.
- an example application manager 205 can include an example healing engine 225 configured to modify applications on behalf of users to eliminate undesirable application features detected, for example, by behavior analysis engine 228 and thereby generate modified applications 232 .
- Modified applications 232 can, in some examples, be specifically modified and configured based on the requests, rules, settings, and preferences of a corresponding user.
- Application manager 205 may include an application intelligence engine 230 configured to collect application data (e.g., 235), for instance, from information servers 145 and other sources both internal and external to application manager 205 and its client user devices (e.g., 210).
- An application intelligence engine 230 can be used to collect intelligence regarding one or more applications served, for instance, by application servers 140. The intelligence can be used in connection with services provided by application manager 205, such as behavior analysis and assessments of applications by application manager 205, among other examples.
- a user device may include one or more processor devices 242 and one or more memory elements 245 as well as one or more other software- and/or hardware-implemented components including, for example, a mode manager 248 , settings manager 252 , security tools 250 , and one or more applications 255 (e.g., procured through application servers 140 ).
- a user device 210 can include a mode manager 248 that is equipped with functionality for defining, enforcing, and otherwise managing multiple application access modes 265 on the user device 210 .
- Mode rules 270 can additionally be managed by mode manager 248 , the mode rules 270 defining, for instance, particular conditions for automatically initiating or enforcing various modes 265 on the user device 210 .
- One or more settings 260 can be defined by users, for instance, through an example settings manager 252, the settings corresponding to and in some cases used in connection with various modes 265 of the device 210, among other examples.
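The mode rules 270 described above, which define conditions for automatically enforcing modes 265, could be evaluated roughly as sketched below. This is an illustrative assumption, not the patent's implementation; the names `ModeRule`, `pick_mode`, and the context keys are all hypothetical.

```python
# Hypothetical sketch of mode-rule evaluation for a mode manager.
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class ModeRule:
    mode: str                            # mode to activate, e.g. "work"
    condition: Callable[[Dict], bool]    # predicate over device context
    priority: int = 0                    # higher priority wins

def pick_mode(rules: List[ModeRule], context: Dict, default: str = "normal") -> str:
    """Return the highest-priority mode whose condition matches the context."""
    matching = [r for r in rules if r.condition(context)]
    if not matching:
        return default
    return max(matching, key=lambda r: r.priority).mode

rules = [
    ModeRule("work", lambda c: c.get("location") == "office", priority=2),
    ModeRule("kids", lambda c: c.get("user") == "child", priority=1),
]
print(pick_mode(rules, {"location": "office"}))  # -> work
print(pick_mode(rules, {"user": "adult"}))       # -> normal
```

Rules here are plain predicates over a context dictionary, so location-, user-, or time-based conditions can all be expressed without changing the evaluator.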
- A behavior analysis engine 228 can assess applications to identify whether one or more functions and/or content of an application are good, bad, suspect, or of unknown quality, among other examples.
- the assessment can be based on information acquired from a variety of sources (e.g., 145 ), such as information servers, user feedback, and other sources.
- an application healing engine 225 can be engaged to modify the application and remediate the identified undesirable functionality to generate a modified application file 232 corresponding to a healed version of the application.
- suspect or unknown applications can be designated, for instance, by a mode manager 248 , to be dedicated to a particular limited access mode of the user device 210 so as to, in effect, quarantine the suspect application until more intelligence is acquired regarding the application's functionality.
- the application may instead be allowed to proceed for installation on a user device.
- Applications that have been healed to generate a modified application file can be allowed to proceed to the user device for installation on the device, among other examples.
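As a loose illustration of the healing flow described above (not the disclosure's actual mechanism), healing can be thought of as stubbing out flagged call sites in a model of the application while leaving the rest of its behavior intact; `heal` and the call names are invented for this sketch.

```python
# Illustrative sketch: "healing" as replacement of undesired API calls
# with a harmless stub in a simplified model of an application binary.
def heal(call_sites, undesired, stub="noop"):
    """Replace undesired API calls with a stub, leaving others intact."""
    return [stub if call in undesired else call for call in call_sites]

app_model = ["openFile", "sendTextMessage", "drawFrame"]
healed = heal(app_model, undesired={"sendTextMessage"})
print(healed)  # -> ['openFile', 'noop', 'drawFrame']
```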
- FIG. 4 includes a block diagram 400 illustrating example principles and activities enabled through an example application behavior analysis engine.
- Application binaries 405 can be accessed or received by a disassembler data/control flow analyzer 410 which, in combination with ambient application knowledge 415 (e.g., collected from outside information sources as well as users, reviewers, etc.) such as application descriptions, reviews, comments, and other structured and unstructured data, can develop a model of the application logic 420 for each application binary 405 .
- The disassembler and control flow analyzer 410 can identify behaviors 425 of the given application based on, for example, comparing the code or application logic model with known functionality defined in or identifiable from a software development kit and/or common APIs utilized by the corresponding client device operating system as well as most or all applications compatible with the client device.
- Some examples include the Google Android software development kit, Apple iOS software development kit, Windows software development kit, among other examples.
- a platform software development kit can provide documentation, header files, libraries, commands, interfaces, etc. defining and providing access to the various platform subsystems accessible by applications compatible with the platform.
- For a platform SDK and corresponding APIs and API calls (i.e., calls to functions and routines of the API), the semantics of commonly used APIs can be represented in a program-readable form along with the critical information necessary to derive application behavior.
- the semantics of the platform SDK can be represented so that an example application behavior engine can use the semantic model to understand and identify the operations and behaviors of a given application using the API call.
- All of the potential API calls of the platform can be represented, for instance through API intelligence 430, by tagging the name of each respective API call with a behavioral tag describing what the respective API call does on the platform, as well as the corresponding parameters of the API's operations and behaviors.
- a template of such a semantic representation can be modeled, for instance, as:
- a “category” can designate the type of an API call and be used to identify the general functionality of such API calls, such as, that the API call reads information from a particular subsystem, disk, etc. generates various messages, initiates various network behaviors, attempts to communicate with various outside servers, triggers particular device functions or elements (e.g., a camera, SMS controller, etc.).
- “Sensitivity” can represent the respective sensitivity of the subsystem affected or associated by the API in the context of the potential for malicious behavior in connection with the subsystem, such as whether reading to a particular memory location introduces the potential for spying, where the subsystem potentially permits the introduction of malware, unauthorized tracking or data collection, the unauthorized or undesired reading or sending of SMS or email messages, among many other examples.
- “dependency” can represent whether the output of this API can have an impact on other parts of the program in a direct way. For instance, a sendTextMessage( ) API can be identified as having no dependency where the API simply sends an SMS message out and does not return anything, among other examples.
- GTI: global threat intelligence.
- an example application behavior analysis engine (e.g., 228 ) can possess functionality for identifying the control flows, operations, functionality, and behavior of a given application based, for instance, on a semantic representation of a standard platform SDK upon which compatible applications are based.
- Turning to FIG. 5, a representation 500 of a simplified application control flow is shown for an example gaming application. While the functionality of the game may be, in the main, desirable, secure, and benign, deeper inspection of the code of the game application binary, in comparison with the semantic representation of the platform SDK as well as ambient application intelligence for the game application, may yield identification of other functionality that is not immediately or otherwise identifiable, understood, or appreciated by users, such as the application sending SMS messages with or without a user's explicit knowledge or permission.
- In another example, shown in a further figure, inspection of a particular object of an application binary may reveal the totality of functions and control flows of the given application as well as reveal dependencies between distinct programs, program units, or applications that the user may not otherwise realize, understand, or approve of.
- identified behavior heuristics can be represented externally, in some implementations, in an XML file that identifies the specific pattern of data flow and calls, from which the behavior can be identified. For instance:
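The actual XML schema of such a heuristic file is not reproduced here, so the following is a hypothetical pattern and a minimal matcher. The element and attribute names, the `sms_leak` behavior, and the `matches` helper are all assumptions illustrating the general idea of a data-flow pattern expressed in XML:

```python
import xml.etree.ElementTree as ET

# A hypothetical XML heuristic: an SMS-leak behavior is flagged when a read
# of the SMS inbox flows into a network write. Element and attribute names
# are illustrative, not taken from any real rule schema.
RULE_XML = """
<behavior name="sms_leak">
  <dataflow>
    <read subsystem="sms_inbox"/>
    <write subsystem="network"/>
  </dataflow>
</behavior>
"""

def matches(rule_xml: str, observed_flows: set) -> bool:
    """Return True if the (read, write) subsystem pair named by the rule
    appears among the data flows observed in the application model."""
    rule = ET.fromstring(rule_xml)
    flow = rule.find("dataflow")
    pair = (flow.find("read").get("subsystem"),
            flow.find("write").get("subsystem"))
    return pair in observed_flows

# Flows extracted from a hypothetical application logic model.
flows = {("sms_inbox", "network"), ("contacts", "disk")}
```

Keeping heuristics external in this way lets new behavior patterns be distributed without modifying the analysis engine itself.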
- application logic can be modeled and rules can be applied to interpret the application logic and identify instructions and calls within a corresponding binary of the application that correspond with malicious, privacy infringing, policy violating, or other undesirable behaviors.
- the logical model of an application's functionality can include representation (e.g., 505 ) of the application logic through data flow structures and control flow structures, among other examples.
- a dataflow structure can represent the lifetime of data objects as they pass through the application logic (e.g., 510 ) and on to other program units (e.g., 515 ), including external program units.
- a dataflow structure (e.g., 505 ) can be used to identify the flow of data from one part of the application program as it moves and is potentially transformed by the application logic.
- a dataflow model can be used to deduce that particular data is being leaked by the application through an Internet communication post operation, among other examples.
- control flow structures can represent the control flow of different function calls (e.g., 520 , 525 ) to identify an originating source of an application call determined to be sensitive or undesirable.
- a call by the application to send an SMS message can be traced back, for example, to a UI element of an application interacted with by a user, or even to an autonomous event in a background process of the application, among potentially many other examples.
- all platform subsystems can be categorized or assigned weights based on the sensitivity of the respective subsystem in the context of the potential that the subsystem could be manipulated or utilized in connection with a malicious or otherwise undesirable behavior.
- weights and sensitivities can be based on a variety of factors including, for example, the potential for an invasion of privacy, data leaks, financial sensitivity, among other examples. These factors can also form the basis of categorizations of the various subsystems of the platform.
- Such subsystems can include, for example, contact lists, photo galleries, email clients, calendars, Internet connectivity and browsing, graphics, video functionality, cameras, audio, security tools and engines, telephony, Wi-Fi capabilities, Bluetooth capabilities, data ports, battery power, touchscreens, global positioning systems, among potentially many other functionalities and subsystems including future functionality that can be integrated in mobile devices.
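The categorization and weighting of subsystems described above might be sketched as follows. The subsystem names come from the list in the text, but the 0–10 scale and the individual weights are entirely illustrative assumptions:

```python
# Illustrative sensitivity weights for platform subsystems; higher weight
# indicates greater potential for misuse (the 0-10 scale is an assumption).
SUBSYSTEM_WEIGHTS = {
    "contacts": 8, "photo_gallery": 7, "email": 8, "calendar": 5,
    "internet": 6, "camera": 7, "audio": 6, "telephony": 9,
    "sms": 9, "gps": 8, "bluetooth": 4, "battery": 1,
}

def risk_score(subsystems_used: list) -> int:
    """Sum the weights of every subsystem an application touches;
    unknown subsystems contribute zero."""
    return sum(SUBSYSTEM_WEIGHTS.get(s, 0) for s in subsystems_used)
```

An analysis engine could use such a score to prioritize which applications warrant deeper inspection.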
- a rule engine of an application behavior analysis engine can access rules, for instance, from a rule database, including rules that have been custom defined for and/or by a particular user or set of users according, for example, to preferences of the users as well as policies applicable to the users (e.g., policies of an Internet service provider, enterprise network, broadband data provider, etc.).
- the rule engine can take as a further input an application logic model (e.g., developed based on a semantic representation of a platform SDK corresponding to the application) to assess the various operations and functionality of an application as identified in the application logic model.
- the rule engine can assess the various operations and functionality of an application according to rules identified as applicable to the particular instance of an application, such as an instance of an application that has been attempted to be downloaded or installed on a particular user computing device of a user associated with the identified rules.
- Application behaviors can be identified by the rule engine including application behaviors identified as violating one or more rules (e.g., rules forbidding certain behaviors or actions) and prompting, in some instances, remediation of the identified application behaviors and/or assignment of the application to one or more operation modes on the destination user device, such as a quarantine or administrative operation mode, among other examples.
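One way such a rule engine pass might be sketched is below. The `no_autonomous_sms` rule, the `trigger` field in the model, and the `quarantine` action are all hypothetical stand-ins for rules and remediation actions a real engine would load from its rule database:

```python
# Sketch of a rule engine pass: each rule is a predicate over an
# application logic model; violated rules map to remediation actions.
def no_autonomous_sms(model: dict) -> bool:
    """Violated when an SMS send is not traceable to a user-initiated
    trigger (e.g., it originates in a background process)."""
    return any(c["api"] == "sendTextMessage" and c["trigger"] != "user"
               for c in model["calls"])

# (rule name, predicate, action to take when the predicate fires)
RULES = [("autonomous_sms", no_autonomous_sms, "quarantine")]

def evaluate(model: dict) -> list:
    """Return (rule_name, action) pairs for every violated rule."""
    return [(name, action) for name, rule, action in RULES if rule(model)]

# A hypothetical model in which the application sends SMS autonomously.
model = {"calls": [{"api": "sendTextMessage", "trigger": "background"}]}
```

The resulting (rule, action) pairs could then drive healing, quarantining, or assignment of the application to a restricted operation mode.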
- a human-readable description of an identified behavior can be constructed based on a description of the API semantics.
- human relatable verbs and nouns can be associated with template messages in the semantic representation and mapped to particular human understandable descriptions of functions and operations available to the APIs.
- a human-readable summary of the behavior analysis results can be generated from the mapping and presented to a user that describes the various functionality, as well as, in some implementations, the control flow and dataflow of the analyzed application. Such results can make use of the human-readable description to generate a description of the functionality uncovered during analysis of the application, including functionality that may otherwise be invisible to or difficult to detect by the user.
- the template can be utilized and populated so as to identify and describe an example application's functionality for reading SMS data from the user's device.
- corresponding description could be generated such as: “This application reads your SMS data from SMS inbox and sends to a web site.”
- Such a description could be constructed, for example, by filling in an example template based on the semantic representation of the platform SDK and APIs, such as: “This application <verb: reads> your <noun: SMS data> from <noun: SMS inbox> and <verb: sends> to a <noun: website>”, among other examples.
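The template-filling step can be illustrated directly. The dictionary keys below are assumed placeholders mirroring the verb/noun slots of the example; the exact slot names are not specified by the source:

```python
# Hypothetical template expansion: verbs and nouns from a behavior record
# are substituted into a human-readable sentence, mirroring the example
# description given above.
TEMPLATE = ("This application {verb1} your {noun1} from {noun2} "
            "and {verb2} to a {noun3}.")

def render(behavior: dict) -> str:
    """Fill the template slots from a behavior record."""
    return TEMPLATE.format(**behavior)

# Behavior record for the SMS-leak example discussed in the text.
sms_leak = {"verb1": "reads", "noun1": "SMS data", "noun2": "SMS inbox",
            "verb2": "sends", "noun3": "web site"}
```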
- the analyzed application behavior can reveal the use of other applications, programs, or services by the analyzed application.
- a call to a local application, remote service, or other program by the analyzed application may be undesirable, for instance, when the other called application is identified as unsecure, un-trusted, or unknown, among other examples.
- a program called or used by the analyzed application may be identified as a trusted program.
- an application behavior analysis engine can make use of, generate, modify, and otherwise manage whitelists and/or blacklists that identify the status and reputations of various programs that have been known to or could be potentially called by various analyzed applications.
- applications and services hosted by remote servers can additionally be identified in such whitelists and/or blacklists by the respective URLs or other address information corresponding to their respective host servers, among other examples.
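A minimal sketch of such a reputation lookup follows; the package names and URLs are hypothetical, and a real engine would back these sets with a managed database rather than in-memory literals:

```python
# Illustrative reputation lookup: programs called by an analyzed
# application are checked against whitelists and blacklists; remote
# services are identified by the URLs of their host servers.
WHITELIST = {"com.example.trustedlib", "https://api.trusted.example.com"}
BLACKLIST = {"com.example.spyware", "https://evil.example.net"}

def reputation(callee: str) -> str:
    """Classify a called program or remote service; blacklist entries
    take precedence over whitelist entries."""
    if callee in BLACKLIST:
        return "untrusted"
    if callee in WHITELIST:
        return "trusted"
    return "unknown"
```

A call into an "untrusted" or "unknown" callee could then be flagged as an undesirable behavior in its own right.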
- the behavioral analysis engine can identify the context in which a particular activity is performed, platform API is accessed, or functionality is employed by the application under assessment.
- an analyzed application's attempts to access a platform telephony subsystem can be assessed based upon the cause or context of the attempt. For instance, in some contexts, a particular API call may be perfectly acceptable while in other contexts the API call can be undesirable.
- identified application functionality that accesses the telephony subsystem in response to a user interface interaction, such as a button press may be assessed differently than an attempt by an application to access the telephony subsystem autonomously and not in response to a user provided directive, among other examples.
- rules can be defined that can be used in the assessment of application behaviors.
- Such rules can be represented and configured for use in performing heuristic analysis of an application's logic or of a potentially malicious behavior identified by an application behavior analysis engine, including contexts in which the behavior is to be determined to be malicious.
- a rule engine can apply one or more rules to an application logic model to identify one or more potentially malicious or otherwise undesirable behaviors present in the application.
- a rule can be represented as:
- <Rule> <Run> <Dataflow> <ReadOperation> of <read sub system> to a <WriteOperation> of <write sub system>
- the rules can be generic or can be specific to a particular subsystem, etc., such as a rule to detect a data leak from a memory element storing personal contact data, among other examples.
- a specific application behavior can be derived based on application of a single rule or multiple rules.
- an application behavior analysis engine can be hosted on one or more server computing devices remote from the mobile user devices for which analysis is performed.
- at least a portion of the application behavior analysis engine can be provided on a user device, alternatively or redundantly with the functionality of server-side application behavior analysis engine components.
- a user computing device can be provided with application behavior analysis engine functionality allowing at least a partial or quick preliminary assessment of an application to be performed at the user device to thereby provide a user with fast feedback as well as assess whether an application should be quarantined, denied download or installation, and/or forwarded to a remote application behavior analysis engine, such as one provided in a cloud system, allowing then for a more robust behavioral analysis of the application (that could possibly introduce increased latency into the behavioral analysis assessment).
- a user can be provided with a prompt identifying the analysis of the application as well as providing the user with various options for dealing with the installation, downloading, or launching of the analyzed application. For instance, a user may be provided with the option of skipping the analysis, delaying installation of the analyzed application, assigning the analyzed application to a particular mode, among other examples.
- a prompt presented to the user in connection with the assessment may be presented together with information, such as preliminary information, gleaned from the behavioral analysis engine assessments and/or external intelligence relating to the analyzed application.
- Such intelligence can include, for example, intelligence gleaned by the behavioral analysis engine in previous assessments of the analyzed application, among other examples.
- the behavioral analysis engine can indicate to the user behaviors discovered for the application, how other users have responded to feedback received from the behavioral analysis engine regarding the particular analyzed application, among other examples.
- behavioral analysis engine can maintain blacklists, greylists, and/or whitelists of applications known to and/or previously analyzed by the behavioral analysis engine. Such blacklists, greylists, and/or whitelists can be based on historical intelligence collected from previous behavioral analyses, outside intelligence from other sources, and other users.
- the behavioral analysis engine can utilize such information to perform an initial assessment of an application and leverage information gleaned from previous analyses. Initial filtering or feedback can thereby be provided to a user to assist the user in determining how to deal with a particular application as well as whether to initiate further behavioral analysis on the application using the behavioral analysis engine.
- Behavioral analysis of applications and/or blacklists/whitelists can further incorporate or consider general reputation information of developers or other parties identified as responsible for various applications, among other examples and considerations. Rules can be defined that consider the trustworthiness or untrustworthiness of the developer, distributor, etc. of an application.
- AppDeveloperRating = f(total number of apps, weighted average of undesired behavior in the apps, popularity of the apps, average ratio of low ratings), among other examples.
- a weighted average of undesired behavior can be generated for a set of applications of a developer:
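The source names the inputs to the rating but not the exact function, so the weighting and combination below are assumptions. Install counts are used here as a hypothetical popularity weight, and the 10-point scale is likewise illustrative:

```python
# Illustrative developer rating: weight each app's undesired-behavior
# count by its share of the developer's total installs, then subtract
# that (plus the low-rating ratio) from a 10-point baseline.
def weighted_undesired(apps: list) -> float:
    """Install-weighted average of undesired behaviors across a
    developer's set of applications."""
    total_installs = sum(a["installs"] for a in apps)
    return sum(a["undesired_behaviors"] * a["installs"]
               for a in apps) / total_installs

def developer_rating(apps: list, low_rating_ratio: float) -> float:
    """Higher is better; penalize undesired behavior and low user ratings."""
    penalty = weighted_undesired(apps) + low_rating_ratio
    return max(0.0, 10.0 - penalty)

# Two hypothetical apps from one developer.
apps = [{"installs": 900, "undesired_behaviors": 1},
        {"installs": 100, "undesired_behaviors": 6}]
```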
- a behavioral analysis engine can interface with intelligence databases, such as a global threat intelligence (GTI) feed, to provide additional intelligence gleaned from the behavioral analyses of applications performed by the behavioral analysis engine itself, among other examples.
- applications offered by one or more application servers or storefronts may provide users with basic descriptions, ratings, user feedback, etc. collected for a given application.
- ratings, application descriptions, content ratings, etc. may be provided by, manipulated by, or otherwise influenced by the application developers themselves thereby diminishing, potentially, the truthfulness or legitimacy of the information provided to users regarding some applications.
- intelligence (e.g., behavioral descriptions) generated by the behavioral analysis engine may be used to supplement, correct, or otherwise modify descriptions provided to users in connection with their browsing, purchasing, and downloading of applications available on a platform.
- a behavioral analysis engine can make use of these default application descriptions, content ratings, user feedback etc. as external intelligence considered in connection with a behavioral analysis.
- a behavioral analysis engine may be used to identify common behavioral traits between multiple applications that can serve as the basis for categorizing the applications according to behavior. Such categories can then be provided to users to assist users in better understanding the qualities and behaviors, as well as potential risks, of various applications, among other examples.
- Turning to FIG. 8, a simplified schematic diagram 800 is shown of an example flow for performing deep analysis of application behavior (e.g., using a behavioral analysis engine) and performing application healing in an attempt to remedy those behaviors determined to be undesirable in an application while still preserving other core functionality of the application, in some examples.
- application binaries can be submitted to a disassembler and data control flow analyzer 410 (e.g., of a behavior analysis engine) to develop application logic models (e.g., 420 ) based, in some examples, additionally on ambient application knowledge 415 , intelligence, and the like.
- the model of application logic 420 can be assessed based on defined rules, platform API intelligence, and behavioral heuristics through a behavioral heuristics/rules engine 435 to identify application behaviors of a respective application. Further, sections of code of the application can be identified during the assessment as responsible for the exhibited undesirable behavior. This code can be flagged for remediation. Additionally, in instances where application behaviors are identified as undesirable and are requested or dictated, by a user, administrator, or predefined rules, to be healed, the application binaries can be further processed to remove, block, or otherwise remediate the offending behaviors and corresponding code to thereby generate healed versions 232 of the application binaries that a user can then cause to be downloaded, installed, and executed on the user's device.
- the global threat intelligence feed 440 or other intelligence database can provide intelligence for consideration and behavioral analyses as well as application healing. Additionally, intelligence gleaned from the behavioral analyses can be shared with outside intelligence databases that additionally receive input, data, and intelligence from a community of users and systems 445 .
- rules and policies can be defined, for instance, by a user or system or network administrator, to define how and under what conditions applications are to be handled that have been determined to include one or more undesirable behaviors.
- policies can, for example, identify particular types of undesirable behaviors and map such behaviors to predefined courses of action, such as the healing or remediation of the applications, blacklisting or whitelisting of the applications, quarantining of the applications, among other examples.
- user inputs can drive management of an application's deployment on a user computing device. Such inputs can be received in connection with prompts presented to the user and can include, for example, requests to remediate one or more identified undesirable behaviors, instructions to assign the analyzed application to a particular operation mode or quarantine area, among other examples.
- Turning to FIG. 10, a simplified block diagram 1000 is illustrated showing the flow of an example healing of an original application 1005 .
- a healing engine can be provided for identifying, removing, replacing, or blocking the offending code and corresponding behaviors in order to generate a modified application binary 1015 .
- a healing engine 228 may include logic for modifying an application by removing or blocking various types of undesired behaviors such as, in this example, unauthorized reads or accesses of SMS functionality by removing the offending instructions discovered in the original application binary.
- a healing engine may modify the offending code, such as by rewriting the code to redirect an API call to a trusted system, destination, address, etc.
- a healing engine 228 can modify the original code with minimal changes so as to avoid affecting the core desired functionality of the application.
- healing policies can identify the patterns that are considered for identifying application code for healing. This can be represented, for example, in an XML file that identifies the heuristic pattern of code corresponding to an offending behavior. Each type of defined or identified pattern of code can be healed by a specific healing method, such as according to corresponding policies. Such methods can be identified and defined in such a way that the healing does not impact the rest of the application's functionality.
- a variety of healing methods can be employed by an application healer engine. For instance, a particular offending line of code functionality can be identified as a final or leaf node in a control chain. In such instances, the offending code may be determined to be able to be suppressed or removed without affecting other dependencies in the application, among other examples. In another example, if a removal of a particular API call is determined to likely have no impact on surrounding code, the removal healing method can be applied. The nature and character of APIs can be learned, for example, from the semantic platform SDK representation, among other examples.
- the offending behavior can be from one or more sections of code and may result in multiple methods of healing applied to remediate the behavior, such as by replacing the data in a register to alter the behavior of the API or redirecting of the API call to a new version of the API with same interface by replacing the offending API code with the new API code, among other examples.
- the new API may, for example, do nothing and set the register status so as not to impact other parts of the program, process the inputs in a different way to avoid the undesired behavior, or do pre-processing and/or post-processing of the input/output parameter and call the original API, among other example techniques that resolve the undesirable behavior.
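The replacement healing method described above, in which a same-interface stub substitutes for the offending API so that surrounding code is unaffected, might be sketched as follows. The API table, function names, and return convention are illustrative assumptions:

```python
# The offending API: in a real binary this would be the platform call
# discovered during behavioral analysis; here it is a stand-in.
def send_text_message(number: str, body: str) -> bool:
    raise RuntimeError("would really send an SMS")

# The healed replacement: same interface, but it does nothing and
# returns the success status callers expect, so the rest of the
# program keeps working (mirroring the "do nothing and set the
# register status" method described above).
def healed_send_text_message(number: str, body: str) -> bool:
    return True

API_TABLE = {"sendTextMessage": send_text_message}

def heal(api_table: dict, name: str, replacement) -> dict:
    """Redirect an offending API call to a new version with the same
    interface, leaving the original table untouched."""
    api_table = dict(api_table)
    api_table[name] = replacement
    return api_table

healed = heal(API_TABLE, "sendTextMessage", healed_send_text_message)
```

The same redirection pattern covers the pre-/post-processing variant: the replacement would transform its inputs or outputs and then call the original API.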
- Turning to FIG. 11, a simplified block diagram is illustrated showing the identification of code relating to particular undesirable behaviors.
- sections 1 a and 1 b of application code can be identified as corresponding to a first, detected, undesirable behavior and sections 2 a and 2 b can be identified as corresponding to a second undesirable behavior of the application.
- healing the application can include modifying or replacing the identified offending sections of code with code that modifies or suppresses the undesirable behaviors.
- healing policies can be identified corresponding to the identified code or API calls to identify healing techniques for modifying the offending code and remediating the undesired behaviors.
- In FIGS. 12A-12E, additional examples are illustrated of the detection of undesirable behaviors as well as the remediation of the undesirable behaviors.
- In FIG. 12A, an example code fragment allowing an application to send latitude and longitude information to an outside server is shown as having been processed to populate an API template, for instance, utilizing a behavior analysis engine.
- As shown in FIG. 12B, portions of the application code can be identified that correspond to the behavior of collecting geo-positional data and sending the geo-positional data to the outside server.
- the offending lines of code can be replaced, for example with code that masks or redirects the sending of the geo-positional data to prevent the application from tracking user location, among other examples.
- In FIG. 12C, a control flow can be identified within an application along with corresponding application code.
- remediation of a particular undesirable behavior can include deletion of an offending line of code, among other examples.
- FIG. 13 illustrates an example flow 1300 in connection with remediation of one or more detected undesirable behaviors of an application.
- In connection with the dynamic personalization of an application's behavior for a particular user, the composite behaviors of the application and corresponding code segments can be identified.
- a user interface can be presented in connection with the healing or customization of the application allowing the user to select particular identified behaviors for remediation or modifications.
- the user interface can be provided in connection with an application healing engine with the user inputs directing how (e.g., which identified behaviors) the application healing engine is to modify the application.
- an application healing engine can insert one or more user interface controls into the original binary of the application, allowing the user, at launch of the modified application, to dynamically enable, disable, or otherwise remediate or customize the behavior of the application.
- an original section of the code corresponding to an accepted behavior can be utilized in lieu of a healed version of the same code, among other examples.
- each of the segments of the code where behavior is demonstrated can be selectively turned off or on based on the user preferences and inputs.
- the user interface can provide a user with the option of saving the settings of an application so that the selection of a particular subset of application behaviors persists and is available the next time the application is launched on the user's device.
- functionality can be provided to define, enable, and employ defined usage modes on the user devices.
- user devices such as smart phones and tablet computers, among other examples, are designed to support a single user and application profile.
- a single operation profile and mode may not be appropriate for all of the actual users of the device or the situations in which the device is used. For instance, a user may desire to loan their device to a friend for some short period of time, but would like to nonetheless retain control of access to some of the sensitive applications and data on the device, such as email applications, contacts, calendars, messaging functionality, etc.
- the user may desire to allow a child to temporarily use the device, for example, to play a game, but would prefer for other applications (e.g., web browsers) and access to certain device settings and data to be blocked from the child.
- users may desire to restrict usage of some subset of the applications on the device to specific times, locations, and situations. For instance, games and social networking applications may be desired to be disabled during school hours, among other examples.
- FIG. 14A illustrates a simplified block diagram 1400 a of an example implementation of a mode manager.
- various modes may be defined based on intelligence gleaned from the user device as well as outside services.
- a user may define one or more modes through a user interface and a mode manager, for instance, on the device may manage access to the various modes, for example, using dedicated credentials assigned to each of the modes.
- an application monitoring service or application behavioral analysis engine may recommend particular applications for a quarantine or high-security mode available on the user device. Accordingly, a user may define such modes to restrict access to potentially risky or currently analyzed applications to administrative, adult, or other trusted users, among other examples.
- FIG. 14B illustrates another simplified block diagram 1400 b illustrating principles of an application mode manager.
- An application mode manager 248 may include various modules and functionality such as a mode setup manager 1405 , lock service 1410 , lock manager 1415 , credential manager 1420 , application access manager 1425 , application protection service 1430 , password engine 1435 , among other examples.
- the user with administrative privileges can set up passwords or PINs and assign these credentials to modes defined by the user, for instance, using a mode setup manager.
- An access manager can utilize a credential manager to verify whether valid credentials have been received that allow a current user of the device to access one of a set of modes defined for the device. In the event that incorrect credentials are entered, a lock manager can invoke a lock service to lock out the current user from one or more applications by assigning the user to a restricted mode or locking out the user altogether.
- a device mode can be composed of an exclusion list or inclusion list.
- Device modes can be defined as respective sets of applications that are either allowed or somehow protected in that mode, in the sense that their usage is prohibited or limited.
- an exclusion list can be defined for a mode that indicates a particular subset of the applications and/or subsystems of a device that are accessible under the corresponding mode (i.e., with the remaining applications protected or locked in that mode).
- a mode can be defined, for example, according to: <ModeName, Inclusion/Exclusion, Access PIN, App1, App2, App3 . . . AppN>.
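That tuple might be rendered as a small data structure. The class layout and the `allows` helper are assumptions; only the fields themselves come from the definition above:

```python
from dataclasses import dataclass

# Direct rendering of the <ModeName, Inclusion/Exclusion, Access PIN,
# App1..AppN> tuple described above; the class layout is an assumption.
@dataclass
class DeviceMode:
    name: str
    list_type: str      # "inclusion" or "exclusion"
    access_pin: str
    apps: frozenset

    def allows(self, app: str) -> bool:
        """Inclusion lists permit only the listed apps; exclusion lists
        permit everything except the listed apps."""
        in_list = app in self.apps
        return in_list if self.list_type == "inclusion" else not in_list

# A hypothetical child mode that exposes only a single game.
kids = DeviceMode("kids", "inclusion", "1234", frozenset({"game"}))
```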
- each device mode can be protected and associated with a particular password.
- a master mode can be defined that allows access to the entirety of the device's functionality and applications. Accordingly, a master password can be provided that enables access to the master mode.
- the user may be provided with access to a management console for managing the set of modes available or defined at the device. Accordingly the user may edit or define modes through the management console, as well as activate or delete predefined modes.
- An example management console can allow a user to select, from a listing of applications, those applications the user wishes to designate as protected or accessible in any given mode. In some cases, a single application can be allowed or protected under multiple different modes.
- mode passwords may be stored in encrypted memory.
- the password of each mode can be encrypted using a key generated by the same password.
- a stored, encrypted password can then be validated by decrypting the password with a key generated from the password entered by the user.
- the decrypted data can then be compared with the user-entered password.
- a corresponding mode can be identified and authenticated to allow access to the mode by the user.
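The store-and-validate scheme described above, in which each mode's password is encrypted with a key derived from the password itself, might be sketched as follows. PBKDF2 and an XOR keystream are assumptions standing in for whatever cipher a real implementation would use; a production system would employ a vetted authenticated encryption scheme:

```python
import hashlib
import os

def _key(password: str, salt: bytes) -> bytes:
    """Derive a 32-byte key from the password itself (per the scheme
    described above); parameters are illustrative."""
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

def store(password: str) -> tuple:
    """Encrypt the mode password with a key generated from that same
    password; return (salt, ciphertext) for storage in encrypted memory."""
    salt = os.urandom(16)
    key = _key(password, salt)
    data = password.encode().ljust(32)[:32]           # fixed-width plaintext
    cipher = bytes(a ^ b for a, b in zip(data, key))  # XOR with derived key
    return salt, cipher

def validate(entered: str, salt: bytes, cipher: bytes) -> bool:
    """Decrypt the stored value with a key derived from the entered
    password and compare, as described above."""
    key = _key(entered, salt)
    decrypted = bytes(a ^ b for a, b in zip(cipher, key))
    return decrypted == entered.encode().ljust(32)[:32]

salt, cipher = store("guest-pin")
```

Only the correct entry regenerates the key that decrypts the stored value back to a matching password, at which point the corresponding mode can be activated.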
- the user may manually lock the device or the device may lock itself, for instance, after a prolonged period of inactivity. When attempting to unlock the device or wake up the device a user may be again presented with a login prompt requesting a password of one of the modes available and defined for the device.
- modes can be hierarchical. For instance, a user logged into a higher level mode (i.e., a mode providing a relatively greater level of access), may be able to freely move to another mode without providing credentials for that lower-level mode. On the other hand, a user who has been authenticated to a lower level mode may be forced to enter additional credentials when attempting to access another mode at a higher level in the hierarchy than the lower-level mode to which the user was previously authenticated. For example, in one instance, four device modes can be defined where:
- Mode1 is an admin-level mode
- Mode2 is a guest-level mode
- Mode3 is a guest-level mode
- Mode4 is a low-privilege mode
- Mode1 > (Mode2 and Mode3) > Mode4, where Mode2 is at the same level as Mode3, among other example implementations.
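The four-mode hierarchy above can be sketched with numeric levels. The level values are arbitrary; only their ordering matters, and the credential rule implements the described behavior of moving freely downward but authenticating to move upward:

```python
# Hierarchy from the four-mode example: Mode1 > (Mode2 = Mode3) > Mode4.
# The numeric level values are arbitrary; only their ordering matters.
LEVEL = {"Mode1": 3, "Mode2": 2, "Mode3": 2, "Mode4": 1}

def needs_credentials(current: str, target: str) -> bool:
    """Moving to a same-or-lower-level mode is free; moving to a
    higher-level mode requires additional credentials."""
    return LEVEL[target] > LEVEL[current]
```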
- configuration of the device can be altered, customized, or at least partially restricted when certain modes are active.
- a particular mode can activate or deactivate GPS functionality, data access, telephony, as well as certain applications.
- device modes can be provided that secure the data of particular applications associated with a mode. For instance, once a new mode has been created and assigned a corresponding access level for a set of applications, the data of these applications may be protected by encryption through a separate encryption key. This can be implemented, for example, by using an encrypting file system for encrypting files and folders, among other examples.
- the executable code of applications can be secured to protect against applications being used in modes that disallow access and/or use to one or more of the behaviors or features of the application.
- the application executable can be stored in encrypted secondary storage.
- An operating system loader of the user device can gain conditional unencrypted access to the executable code, in some examples, only if the application is found in an allowed application list for the active device mode in which access to the application is attempted, among other potential implementations.
- defining multiple device modes for a user device can further result in the provision of multiple unique home screens to be presented in each of the corresponding modes.
- the appearance of a given home screen can indicate to a user the mode that is active on the device as well as access privileges available in that mode.
- home screens can include icons of applications that are available within that corresponding mode, hiding or obscuring the icons of other applications that are protected within that mode, among other examples.
- device modes can be created automatically, for instance, based on identified behaviors and security profiles of applications that are detected or loaded on the user device.
- a mode manager can make use of behavioral analyses performed, for example, by an example application behavioral analysis engine, to identify applications that exhibit a common category of behaviors or category of security profiles. For instance, applications identified as permitting access to online resources may be grouped and assigned dynamically to one or more modes that have been defined as allowing such access. Other modes, such as modes dedicated for underage users, may be denied access to applications that allow users to access the Internet, among other examples.
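Such automatic grouping by behavior category might be sketched as follows, with hypothetical applications and categories; a real mode manager would populate the behavior sets from the behavioral analysis engine's results rather than a literal table:

```python
# Behavior categories exhibited by each application, as would be reported
# by a behavioral analysis engine (the apps and categories are illustrative).
APP_BEHAVIORS = {
    "browser": {"internet"},
    "chess": set(),
    "chat": {"internet", "sms"},
}

def apps_for_mode(denied_categories: set) -> set:
    """Allow only applications exhibiting none of the denied behavior
    categories (e.g., denying 'internet' for an underage-user mode)."""
    return {app for app, cats in APP_BEHAVIORS.items()
            if not cats & denied_categories}

# A hypothetical child mode that denies any Internet-capable application.
child_mode_apps = apps_for_mode({"internet"})
```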
- Other example categories may include applications that enable telephony or mobile messaging functionality, applications that make use of subsystems that utilize sensitive data, collect potentially private information (e.g., cameras, voice recorders, GPS systems, etc.), and other examples.
- ambient intelligence relating to an application such as an age rating (e.g., 7+, 12+, 18+ years, etc.), user reviews, or other information may be used to categorize applications and group them in various modes.
- a description of an application may include an age or maturity rating as well as reasons for the maturity rating. Accordingly, in one example, one or more modes may be defined, for example, that block access by child users to applications with higher maturity ratings, among other examples.
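- The rating-based grouping described above might be sketched as follows; the mode names and rating thresholds are invented for illustration:

```python
# Maximum age rating permitted in each hypothetical mode.
MODE_MAX_RATING = {"child": 7, "teen": 12, "adult": 18}

def modes_for_app(age_rating: int) -> list[str]:
    """Return the modes in which an app with this age rating may be included."""
    return sorted(m for m, limit in MODE_MAX_RATING.items() if age_rating <= limit)
```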
- application information can be constructed from security information regarding behaviors of an application from global threat intelligence 440, publisher/developer reputation information 1905, app store feedback and reviews 1910, and behavior analysis results 1915, among other examples.
- such information (e.g., 440, 1905, 1910, etc.) can include behavioral assessments 1915 of the applications (e.g., whether an application potentially leaks data, provides location information, enables SMS messaging, etc.).
- a user may further designate custom categories or behaviors, or select pre-defined categories or behaviors, as the basis for assignments of applications to respective modes rather than individually selecting the applications for inclusion in one or more modes on an à la carte basis, among other examples.
- In FIG. 15A, an example algorithm is represented for storing password information associated with a particular mode.
- FIG. 15B represents an example algorithm for validating a password and identifying a mode to activate that corresponds to the entered password. It should be appreciated that the algorithms of FIGS. 15A-15B are non-limiting examples presented merely for purposes of illustration and that other alternative algorithms and implementations can be utilized in other instances.
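- Although the algorithms of FIGS. 15A-15B are not reproduced here, the password-to-mode scheme they describe can be approximated as follows: each mode stores a salted password hash, and an entered password both identifies and authenticates the matching mode. The storage layout and iteration count are illustrative assumptions:

```python
import hashlib
import hmac
import os

def store_mode_password(store: dict, mode: str, password: str) -> None:
    """Record a salted hash of the password associated with a mode."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    store[mode] = (salt, digest)

def mode_for_password(store: dict, entered: str):
    """Identify and authenticate the mode, if any, matching the entered password."""
    for mode, (salt, digest) in store.items():
        candidate = hashlib.pbkdf2_hmac("sha256", entered.encode(), salt, 100_000)
        if hmac.compare_digest(candidate, digest):
            return mode
    return None  # no mode matches; login fails without revealing which modes exist
```

Because the lookup is driven entirely by the entered password, hidden modes need not be listed on the login screen.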
- modes defined by a given user may be provided, for instance, to an application management service, cloud service or other service (e.g., 1600 ) that allows one or more modes, as well as rules associated with the modes, to be aggregated and shared with other users.
- shared device modes maintained by a mode sharing service 1600 can be browsed and selected for download and utilization on user devices 110 , 120 , allowing a user to provision their own device with modes created by other users and shared using the mode sharing service.
- the user can provision the shared mode, in some examples, by downloading and installing a definition of the shared mode from the mode sharing service and assigning a unique password to the newly installed mode.
- mode configurations can be shared directly between devices, with one device obtaining a new mode from another device sharing the mode, for instance, through wireless peer-to-peer technologies like Bluetooth, near field communications (NFC), WiFi, and others.
- modes can be activated automatically based on context information detected, for example, by the device itself.
- a user, in some examples, can configure (e.g., on the management console) rules for automatically activating particular modes. For instance, a particular mode can be activated automatically in response to the detection of a specific context at the user device.
- Such contexts can include, for example, detecting the location or proximity of the device within a defined geo-fence, detecting that the device is in proximity of other devices, detecting the device in range of particular data networks, detecting a user of the device (e.g., based on user biometric information collected by the device), a detected time of day, device battery status, usage activity (e.g., to guard against particular users spending too much time on the device, etc.), whether the device is traveling or in motion (e.g., as detected through GPS functionality, accelerometers, or other functionality on the device), among potentially many other examples.
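- One hedged illustration of such context-driven activation pairs each rule with a predicate over the detected context; the contexts and rule set below are invented examples:

```python
def pick_mode(context: dict, rules: list) -> str:
    """Return the mode named by the first rule whose predicate matches the context."""
    for predicate, mode in rules:
        if predicate(context):
            return mode
    return "default"  # fall back when no rule's context is detected

# Hypothetical rules: a geo-fence around a school, and a late-hour curfew.
rules = [
    (lambda c: c.get("in_school_geofence"), "school"),
    (lambda c: c.get("hour", 12) >= 21, "bedtime"),
]
```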
- modes can be provisioned and configured through a remote service, such as a cloud service, allowing a user to activate/deactivate or define a mode remotely.
- a user can create a mode remotely (e.g., using a computer other than the target mobile user device) and provision one or more modes to the target user device and also activate and deactivate the mode on the user device from a remote location.
- an administrator can also use the service to provision such modes on mobile user devices as well as define rules and contexts for automatically activating, applying, or deactivating a given mode, among other examples.
- FIGS. 20A-20D illustrate example screenshots of user interfaces showing particular features of some example implementations of mode management on a mobile user device.
- screenshot in FIG. 20A illustrates a user interface for defining a new mode and mode password.
- a similar user interface can be provided to allow a user to select and activate one of multiple available modes on the device and/or provide credentials for the selected mode.
- a user device may include native login credentials or a native login manager.
- a mode manager may be implemented as an application itself that overrides a native login manager and replaces a native login screen with the mode-specific login prompts (e.g., that allow the multi-mode functionality of the user device).
- a user may not be able to visually distinguish that a user device is provisioned with multiple modes, with the login screen capable of accepting one of a plurality of different login codes, each login code corresponding to a supported mode (including hidden modes) provisioned on the user device.
- the screenshot of FIG. 20B illustrates a view of a home screen for a particular mode.
- a set of restricted applications can be designated that can only be accessed by providing credentials to and activating a higher level mode (e.g., that permits access of the restricted applications).
- a My Apps folder can provide access to those applications that have been enabled in a current active mode.
- The screenshot of FIG. 20C provides another view of an example administrative screen that permits users to activate, edit, or create new modes.
- example screenshot of FIG. 20D illustrates a user interface that can be provided in some implementations of a mode manager allowing a user to designate from a list of applications on the device which applications are to be included or protected in a given mode, and so on.
- FIGS. 21A-21C are flowcharts 2100 a - c illustrating example techniques in the management of applications on mobile user computing devices.
- code of a particular application can be analyzed 2105 , for instance, against a semantic representation of a platform, such as a representation of a platform SDK and/or APIs.
- a set of behaviors of the particular application can be identified 2110 .
- At least one undesirable behavior in the set of behaviors can be identified 2115 , for instance, based on the user selection of one of the identified set of behaviors or automatically according to rules and/or policies defined (e.g., by a user or administrator) for applications to be downloaded, installed, launched, or otherwise used at a particular mobile computing device.
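- The analysis steps 2105-2115 can be illustrated, very loosely, as a lookup of observed API calls in a semantic model that maps them to behavior labels, with undesirable behaviors flagged against a policy. The API names, behavior labels, and policy are hypothetical stand-ins for a real platform SDK:

```python
# Toy "semantic model": platform API calls mapped to behavior labels.
SEMANTIC_MODEL = {
    "TelephonyManager.getDeviceId": "reads_device_id",
    "SmsManager.sendTextMessage": "sends_sms",
    "LocationManager.getLastKnownLocation": "reads_location",
}

# Policy rules marking certain behaviors as undesirable for this device.
POLICY_DISALLOWED = {"sends_sms", "reads_location"}

def analyze(api_calls: list[str]) -> tuple[set, set]:
    """Map observed API calls to behaviors and flag those violating policy."""
    behaviors = {SEMANTIC_MODEL[c] for c in api_calls if c in SEMANTIC_MODEL}
    undesirable = behaviors & POLICY_DISALLOWED
    return behaviors, undesirable
```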
- a behavior can be identified 2120 from a set of behaviors detected for a particular application (e.g., according to the principles of the example of FIG. 21A).
- a section of code of the particular application can then be identified 2125 corresponding to the identified behavior.
- a remediation action can be performed 2130 on the identified section of code to automatically remediate the behavior, for instance, in response to an identification that the identified behavior is an undesirable behavior, etc.
- the remediation action can result in the dynamic generation of a “healed” version of the particular application that retains at least a portion of its original functionality, with the undesired functionality being blocked or stripped from the healed version.
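- The identification and remediation steps 2120-2130 might be sketched as follows, using an invented instruction format in which each instruction is tagged with the behavior it implements; "strip" drops the offending section, while "nullify" replaces it with a no-op:

```python
def heal(instructions: list[tuple[str, str]], undesired: str, action: str = "strip"):
    """Produce a healed instruction sequence with the undesired behavior removed
    or neutralized, preserving all other behaviors."""
    healed = []
    for op, behavior in instructions:
        if behavior == undesired:
            if action == "nullify":
                healed.append(("nop", behavior))  # neutralize in place
            # "strip": drop the instruction entirely
        else:
            healed.append((op, behavior))
    return healed
```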
- a particular one of a plurality of modes can be activated 2140 .
- the modes can be defined for a particular user computing device and dictate what subset of the functionality of the computing device and its software may be accessible to a particular user having credentials for accessing a respective mode in the plurality of modes.
- Access can be restricted 2145 to one or more applications installed on the user computing device according to the activation 2140 of the particular mode.
- activation of the particular mode can result in a restricted or alternate configuration of the computing device to be applied that thereby limits a user's access to one or more subsystems and functionality, including hardware functionality, and settings and data of the user computing device, among other examples.
- Embodiments of the subject matter and the operations described in this specification can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them.
- Embodiments of the subject matter described in this specification can be implemented as one or more computer programs, i.e., one or more modules of computer program instructions, encoded on computer storage medium for execution by, or to control the operation of, data processing apparatus.
- the program instructions can be encoded on an artificially generated propagated signal, e.g., a machine-generated electrical, optical, or electromagnetic signal that is generated to encode information for transmission to suitable receiver apparatus for execution by a data processing apparatus.
- a computer storage medium can be, or be included in, a computer-readable storage device, a computer-readable storage substrate, a random or serial access memory array or device, or a combination of one or more of them.
- while a computer storage medium is not a propagated signal per se, a computer storage medium can be a source or destination of computer program instructions encoded in an artificially generated propagated signal.
- the computer storage medium can also be, or be included in, one or more separate physical components or media (e.g., multiple CDs, disks, or other storage devices), including a distributed software environment or cloud computing environment.
- Networks can include one or more network elements.
- Network elements can encompass various types of routers, switches, gateways, bridges, load balancers, firewalls, servers, inline service nodes, proxies, processors, modules, or any other suitable device, component, element, or object operable to exchange information in a network environment.
- a network element may include appropriate processors, memory elements, hardware and/or software to support (or otherwise execute) the activities associated with using a processor for screen management functionalities, as outlined herein.
- the network element may include any suitable components, modules, interfaces, or objects that facilitate the operations thereof. This may be inclusive of appropriate algorithms and communication protocols that allow for the effective exchange of data or information.
- the operations described in this specification can be implemented as operations performed by a data processing apparatus on data stored on one or more computer-readable storage devices or received from other sources.
- the terms “data processing apparatus,” “processor,” “processing device,” and “computing device” can encompass all kinds of apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, a system on a chip, or multiple ones, or combinations, of the foregoing.
- the apparatus can include general or special purpose logic circuitry, e.g., a central processing unit (CPU), a blade, an application specific integrated circuit (ASIC), or a field-programmable gate array (FPGA), among other suitable options.
- although processors and computing devices have been described and/or illustrated as a single processor, multiple processors may be used according to the particular needs of the associated server. References to a single processor are meant to include multiple processors where applicable.
- the processor executes instructions and manipulates data to perform certain operations.
- An apparatus can also include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, a cross-platform runtime environment, a virtual machine, or a combination of one or more of them.
- the apparatus and execution environment can realize various different computing model infrastructures, such as web services, distributed computing and grid computing infrastructures.
- a computer program (also known as a program, software, software application, script, module, (software) tools, (software) engines, or code) can be written in any form of programming language, including compiled or interpreted languages, declarative or procedural languages, and it can be deployed in any form, including as a standalone program or as a module, component, subroutine, object, or other unit suitable for use in a computing environment.
- a computer program may include computer-readable instructions, firmware, wired or programmed hardware, or any combination thereof on a tangible medium operable when executed to perform at least the processes and operations described herein.
- a computer program may, but need not, correspond to a file in a file system.
- a program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub programs, or portions of code).
- a computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
- Programs can be implemented as individual modules that implement the various features and functionality through various objects, methods, or other processes, or may instead include a number of sub-modules, third party services, components, libraries, and such, as appropriate. Conversely, the features and functionality of various components can be combined into single components as appropriate.
- programs and software systems may be implemented as a composite hosted application. For example, portions of the composite application may be implemented as Enterprise Java Beans (EJBs) or design-time components may have the ability to generate run-time implementations into different platforms, such as J2EE (Java 2 Platform, Enterprise Edition), ABAP (Advanced Business Application Programming) objects, or Microsoft's .NET, among others.
- applications may represent web-based applications accessed and executed via a network (e.g., through the Internet).
- one or more processes associated with a particular hosted application or service may be stored, referenced, or executed remotely.
- a portion of a particular hosted application or service may be a web service associated with the application that is remotely called, while another portion of the hosted application may be an interface object or agent bundled for processing at a remote client.
- any or all of the hosted applications and software service may be a child or sub-module of another software module or enterprise application (not illustrated) without departing from the scope of this disclosure.
- portions of a hosted application can be executed by a user working directly at a server hosting the application, as well as remotely at a client.
- the processes and logic flows described in this specification can be performed by one or more programmable processors executing one or more computer programs to perform actions by operating on input data and generating output.
- the processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit).
- processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer.
- a processor will receive instructions and data from a read only memory or a random access memory or both.
- the essential elements of a computer are a processor for performing actions in accordance with instructions and one or more memory devices for storing instructions and data.
- a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto optical disks, or optical disks.
- a computer need not have such devices.
- a computer can be embedded in another device, e.g., a mobile telephone, a personal digital assistant (PDA), tablet computer, a mobile audio or video player, a game console, a Global Positioning System (GPS) receiver, or a portable storage device (e.g., a universal serial bus (USB) flash drive), to name just a few.
- Devices suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto optical disks; and CD ROM and DVD-ROM disks.
- the processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
- interaction with a user can be provided through a computer having a display device, e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor, for displaying information to the user, and a keyboard and a pointing device, e.g., a mouse or a trackball, by which the user can provide input to the computer.
- Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input.
- a computer can interact with a user by sending documents to and receiving documents from a device, including remote devices, which are used by the user.
- Embodiments of the subject matter described in this specification can be implemented in a computing system that includes a back end component, e.g., as a data server, or that includes a middleware component, e.g., an application server, or that includes a front end component, e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the subject matter described in this specification, or any combination of one or more such back end, middleware, or front end components.
- the components of the system can be interconnected by any form or medium of digital data communication, e.g., a communication network.
- Examples of communication networks include any internal or external network, networks, sub-network, or combination thereof operable to facilitate communications between various computing components in a system.
- a network may communicate, for example, Internet Protocol (IP) packets, Frame Relay frames, Asynchronous Transfer Mode (ATM) cells, voice, video, data, and other suitable information between network addresses.
- the network may also include one or more local area networks (LANs), radio access networks (RANs), metropolitan area networks (MANs), wide area networks (WANs), all or a portion of the Internet, peer-to-peer networks (e.g., ad hoc peer-to-peer networks), and/or any other communication system or systems at one or more locations.
- the computing system can include clients and servers.
- a client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
- a server transmits data (e.g., an HTML page) to a client device (e.g., for purposes of displaying data to and receiving user input from a user interacting with the client device).
- Data generated at the client device (e.g., a result of the user interaction) can be received from the client device at the server.
- One or more embodiments may provide an apparatus, a system, a machine readable medium, and a method to analyze code of a particular application against a semantic model of a software development kit of a particular platform, identify, based on the analysis of the code, a set of behaviors of the particular application, and identify that one or more of the set of behaviors are undesired behaviors.
- the semantic model can associate potential application behaviors with one or more of APIs of the particular platform.
- identifying that one or more of the set of behaviors are undesired behaviors includes determining that the one or more behaviors violate one or more rules.
- the rules can be associated with a particular user.
- a user input identifies one or more of the set of behaviors as undesirable.
- the user input can be received in connection with a user interface displaying human readable descriptions of the identified set of behaviors.
- code of the particular application can be disassembled into a control flow and a model of application logic for the particular application can be generated based at least in part on the semantic model.
- the model of application logic can be further based, at least in part, on ambient application knowledge.
- a remediation action can be performed based on the identification that one or more of the set of behaviors are undesired behaviors.
- the code of the particular application is analyzed in connection with an attempt to implement the particular application on a particular user device.
- One or more embodiments may provide an apparatus, a system, a machine readable medium, and a method to identify a particular behavior in a set of behaviors detected as included in a particular application, identify a section of code of the particular application corresponding to the particular behavior, and perform a remediation action on the section of code to remediate the particular behavior and generate a healed version of the particular application.
- the remediation action preserves other behaviors of the particular application other than the particular behavior.
- the remediation action includes deleting the section of code.
- the remediation action includes rewriting the section of code.
- the remediation action includes adding additional code to the application to nullify the particular behavior.
- the remediation action is identified from a policy identifying a remediation pattern determined to be applicable to remedying the particular behavior.
- the remediation action includes inserting application logic allowing a user to selectively enable a healed version of the particular behavior at launch of the healed application on a user device.
- the user can be further allowed to selectively enable an unhealed version of the particular behavior in lieu of the healed version.
- the set of behaviors of the particular application can be detected through an analysis of code of the particular application.
- the remediation action is triggered by a user request.
- One or more embodiments may provide an apparatus, a system, a machine readable medium, and a method to activate a particular one of a plurality of modes defined for a particular user device, and restrict access to one or more applications installed on the particular user device in accordance with the activated particular mode.
- the restricted applications can be accessible when another one of the plurality of modes is activated.
- the particular mode is activated in response to a particular passcode entered by a user of the particular user device, where each of the plurality of modes is associated with a corresponding passcode.
- Activation of the particular mode can include identifying the particular mode from the plurality of modes based on the entry of the particular passcode, and authenticating access to the particular mode based on the entry of the particular passcode.
- one or more of the plurality of modes are user-defined modes.
- an alternate device configuration can be applied to the particular user device based on activation of the particular mode.
- the alternate device configuration can restrict access to one or more subsystems of the particular user device.
- one of the plurality of modes is an administrative mode allowing for modification of the plurality of modes.
- At least one of the plurality of modes is an instance of a mode downloadable from a mode sharing service remote from the particular user device.
- the particular mode is activated automatically based at least in part on the detection of a particular context using functionality of the particular user device.
- At least a particular one of the applications is restricted based on a defined rule for the particular mode.
- the defined rule pertains to detected behavior of the particular application.
- the plurality of modes includes a mode designated as a quarantine mode for applications awaiting behavioral analysis or remediation.
- the particular mode is activated in response to a user command received at a device remote from the particular user device.
Abstract
Description
- This disclosure relates in general to the field of computer security and, more particularly, to security of mobile devices.
- The distribution and use of mobile devices, such as smart phones, PDAs, laptops, netbooks, and tablets, have grown at a rapid pace. Further, adoption of such devices is expanding, with their numbers overtaking those of desktop computers and feature phones in some developed markets. The sophistication of the operating systems and the hardware capabilities of mobile devices is also increasing and, in some cases, outpacing the feature sets and functionality of traditional computers. For example, modern mobile devices can possess such varied sensors and subsystems as location sensors like global positioning systems (GPS), accelerometers, gyroscopes, near field communication (NFC), etc., that are ordinarily not included on traditional devices. Adding to this the always-connected nature of some mobile devices and the tendency for their owners to constantly carry the devices, mobile devices have become attractive targets for malware developers, hackers, and other malicious actors. Further, “app stores” and other open marketplaces have enabled the development of tens of thousands of applications (or “apps”) for such devices across device platforms such as Google Android™, iOS™, Windows™, etc., with some of these applications being of questionable quality and purpose.
- FIG. 1 is a simplified schematic diagram of an example system including an application management system in accordance with one embodiment;
- FIG. 2 is a simplified block diagram of an example system including an example application manager and user device in accordance with one embodiment;
- FIG. 3 is a simplified block diagram representing analysis and healing of an application for a user device in accordance with one embodiment;
- FIG. 4 is a simplified block diagram representing an example behavioral assessment of an application in accordance with one embodiment;
- FIGS. 5A-5B are simplified representations of control flow within example applications in accordance with some embodiments;
- FIG. 6 is a simplified block diagram representing example subsystems accessible to an example user device in accordance with some embodiments;
- FIG. 7 is a simplified block diagram representing use of rules to determine application behaviors in accordance with some embodiments;
- FIG. 8 is a simplified flow diagram representing assessment of application behaviors and healing of undesired behaviors in accordance with one embodiment;
- FIG. 9 is a simplified flow diagram representing decisions made in connection with the management and remediation of applications determined to include undesirable behaviors based on behavioral analyses of the applications in accordance with one embodiment;
- FIG. 10 is a simplified flow diagram representing an example healing of an application in accordance with one embodiment;
- FIG. 11 is a simplified block diagram representing an example healing of an application in accordance with one embodiment;
- FIGS. 12A-12E represent examples of detection and remediation of undesired behaviors of an application in accordance with some embodiments;
- FIG. 13 is a simplified flow diagram representing an example healing of an application in accordance with one embodiment;
- FIGS. 14A-14B are simplified block diagrams representing features of an example mode manager in accordance with some embodiments;
- FIGS. 15A-15B represent portions of example algorithms for managing modes in a user device in accordance with some embodiments;
- FIG. 16 is a simplified block diagram for sharing device modes between devices in accordance with one embodiment;
- FIG. 17 is a simplified flow diagram illustrating use of context in managing modes of a device in accordance with one embodiment;
- FIG. 18 is a simplified flow diagram illustrating remote provisioning and/or activation of modes on a user device in accordance with some embodiments;
- FIG. 19 is a simplified block diagram representing application information collected in accordance with some embodiments;
- FIGS. 20A-20D are screenshots of example user interfaces provided in connection with mode management of a user device in accordance with some embodiments;
- FIGS. 21A-21C are flowcharts representing example operations involving an example application management system in accordance with some embodiments.
- Like reference numbers and designations in the various drawings indicate like elements.
-
FIG. 1 illustrates an example system 100 including, for instance, an example application management server 105 and one or more mobile user devices. Application management server 105 can provide one or more services to the user devices to assist in the management of applications downloaded, installed, used, or otherwise provided for the user devices. User devices can access application servers 140, such as centralized application storefronts, including, for example, Android Market™, iTunes™, and other examples. Application servers 140 can further include, in some examples, other sources of software applications that can be downloaded and installed on user devices. User devices can communicate with application management server 105 over one or more networks 130, including local area networks and wide area networks such as the Internet. Among the services of an example application management server 105, applications available to user devices can be analyzed and managed by application management server 105. Further, application management server 105, in connection with services made available to user devices, can make use of information servers 145. For instance, such information servers 145 can host services and data that provide additional intelligence and context regarding applications available to user devices. - In general, "servers," "clients," "client devices," "user devices," "mobile devices," "computing devices," "network elements," "hosts," "system-type system entities," and "systems," including system devices in example computing environment 100 (e.g., 105, 110, 115, 120, 125, 140, 145, etc.), can include electronic computing devices operable to receive, transmit, process, store, or manage data and information associated with the
computing environment 100. As used in this document, the term "computer," "processor," "processor device," or "processing device" is intended to encompass any suitable processing device. For example, elements shown as single devices within the computing environment 100 may be implemented using a plurality of computing devices and processors, such as server pools including multiple server computers. Further, any, all, or some of the computing devices may be adapted to execute any operating system, including Linux™, UNIX™, Microsoft Windows™, Apple OS™, Apple iOS™, Google Android™, Windows Server™, etc., as well as virtual machines adapted to virtualize execution of a particular operating system, including customized and proprietary operating systems. - Further, servers, user devices, network elements, systems, and other computing devices can each include one or more processors, computer-readable memory, and one or more interfaces, among other features and hardware. Servers can include any suitable software component or module, or computing device(s) capable of hosting and/or serving software applications and services (e.g., personal safety systems, services and applications of
server 105, etc.), including distributed, enterprise, or cloud-based software applications, data, and services. For instance, in some implementations, an application management server 105, application servers 140, information servers 145, or other subsystems of computing system 100 can be implemented at least in part by cloud-implemented systems configured to remotely host, serve, or otherwise manage data, software services, and applications interfacing, coordinating with, dependent on, or otherwise used by other services and devices in system 100. In some instances, a server, system, subsystem, or computing device can be implemented as some combination of devices that can be hosted on a common computing system, server, server pool, or cloud computing environment and share computing resources, including shared memory, processors, and interfaces. - User, endpoint, or client computing devices (e.g., 110, 115, 120, 125, etc.) can include traditional and mobile computing devices, including personal computers, laptop computers, tablet computers, smartphones, personal digital assistants, feature phones, handheld video game consoles, desktop computers, internet-enabled televisions, and other devices designed to interface with human users and capable of communicating with other devices over one or more networks (e.g., 130). Computer-assisted, or "smart," appliances can include household and industrial devices and machines that include computer processors and whose functionality is controlled, monitored, assisted, supplemented, or otherwise enhanced by the computer processor, other hardware, and/or one or more software programs executed by the computer processor. Computer-assisted appliances can include a wide variety of computer-assisted machines and products including refrigerators, washing machines, automobiles, HVAC systems, industrial machinery, ovens, security systems, and so on.
- Attributes of user computing devices, computer-assisted appliances, servers, and computing devices generally, can vary widely from device to device, including the respective operating systems and collections of software programs loaded, installed, executed, operated, or otherwise accessible to each device. For instance, computing devices can run, execute, have installed, or otherwise include various sets of programs, including various combinations of operating systems, applications, plug-ins, applets, virtual machines, machine images, drivers, executable files, and other software-based programs capable of being run, executed, or otherwise used by the respective devices.
- Some system devices can further include at least one graphical display device and user interfaces, supported by computer processors of the system devices, that allow a user to view and interact with graphical user interfaces of applications and other programs provided in the system, including user interfaces and graphical representations of programs interacting with applications hosted within the system devices as well as graphical user interfaces associated with application management server services and other applications, etc. Moreover, while system devices may be described in terms of being used by one user, this disclosure contemplates that many users may use one computer or that one user may use multiple computers.
- While
FIG. 1 is described as containing or being associated with a plurality of elements, not all elements illustrated within computing environment 100 of FIG. 1 may be utilized in each alternative implementation of the present disclosure. Additionally, one or more of the elements described in connection with the examples of FIG. 1 may be located external to computing environment 100, while in other instances, certain elements may be included within or as a portion of one or more of the other described elements, as well as other elements not described in the illustrated implementation. Further, certain elements illustrated in FIG. 1 may be combined with other components, as well as used for alternative or additional purposes in addition to those purposes described herein. - Turning now to the example block diagram of
FIG. 2 , an example system is shown including an application manager 205 and a user system 210, among other computing devices and network elements including, for instance, application servers 140 and information servers 145 communicating over one or more networks 130. In one example implementation, application manager 205 may include one or more processor devices 215, memory elements 218, and one or more other software and/or hardware-implemented components. For instance, in one example implementation, an application manager 205 may include a share engine 220, user manager 222, healing engine 225, behavior analysis engine 228, and application intelligence engine 230, among other potential machine executable logic, components, and functionality including combinations of the foregoing. - In one example, a
share engine 220 can be configured to provide functionality for managing crowdsourcing of information relating to applications (e.g., made available by application servers 140), as well as the sharing of such information and resources, including resources generated at least in part by or collected by application manager 205. For example, an example share engine 220 can allow modified applications 232 developed for particular users and associated user devices (e.g., 210) as well as defined application modes 240 to be shared across multiple user devices (e.g., 210), among other examples. An example user manager 222 can provide functionality for managing user accounts of various user devices (e.g., 210) that consume or otherwise make use of services of application manager 205. An example user manager 222 can associate various modified applications 232, application data and feedback data (e.g., 235), and application modes 240, including application modes developed or modified by particular users, with one or more user accounts and user devices (e.g., 210) in a system, among other examples. - An
application manager 205 can, in some implementations, additionally include components, engines, and modules capable of providing application management, security, and diagnostic services to one or more user devices (e.g., 210) in connection with user device attempts to download, install, activate, or otherwise use or procure various applications including applications provided through one or more application servers (e.g., 140). For instance, in one example implementation, application manager 205 can include an example behavior analysis engine 228 adapted to analyze and identify functionality of various applications made available to user devices on the system. Further, functionality of applications can be identified, for instance, by behavior analysis engine 228, that users or administrators may wish to block, limit, repair, or modify, among other examples. Accordingly, in some implementations, an example application manager 205 can include an example healing engine 225 configured to modify applications on behalf of users to eliminate undesirable application features detected, for example, by behavior analysis engine 228 and thereby generate modified applications 232. Modified applications 232 can, in some examples, be specifically modified and configured based on the requests, rules, settings, and preferences of a corresponding user. Additionally, application manager 205 may include an application intelligence engine 230 configured to collect application data (e.g., 235), for instance, from information servers 145 and other sources both internal and external to application manager 205 and its client user devices (e.g., 210). An application intelligence engine 230 can be used to collect intelligence regarding one or more applications served, for instance, by application servers 140. The intelligence can be used in connection with services provided by application manager 205, such as behavior analysis and assessments of applications by application manager 205, among other examples.
- In some implementations, a user device (e.g., 210) may include one or
more processor devices 242 and one or more memory elements 245 as well as one or more other software- and/or hardware-implemented components including, for example, amode manager 248,settings manager 252,security tools 250, and one or more applications 255 (e.g., procured through application servers 140). In one example implementation, auser device 210 can include amode manager 248 that is equipped with functionality for defining, enforcing, and otherwise managing multipleapplication access modes 265 on theuser device 210. Mode rules 270 can additionally be managed bymode manager 248, the mode rules 270 defining, for instance, particular conditions for automatically initiating or enforcingvarious modes 265 on theuser device 210. Additionally one ormore settings 260 can be defined by users, for instance, through anexample settings manager 252, the setting corresponding to and in some cases used in connection withvarious modes 265 of thedevice 210, among other examples. - Turning to the example of
FIG. 3 , a simplified block diagram 300 is shown illustrating functionality and flows of an example application manager. For example, a behavior monitor 228 can assess applications to identify whether one or more functions and/or content of an application are good, bad, suspect, or of unknown quality, among other examples. The assessment can be based on information acquired from a variety of sources (e.g., 145), such as information servers, user feedback, and other sources. In instances where "bad" application functionality and/or content is identified, an application healing engine 225 can be engaged to modify the application and remediate the identified undesirable functionality to generate a modified application file 232 corresponding to a healed version of the application. Further, suspect or unknown applications can be designated, for instance, by a mode manager 248, to be dedicated to a particular limited access mode of the user device 210 so as to, in effect, quarantine the suspect application until more intelligence is acquired regarding the application's functionality. In instances where it is determined that an application satisfies rules, requirements, or preferences of a user, network, administrator, etc., the application may instead be allowed to proceed for installation on a user device. Further, for applications that have been healed to generate a modified application file, the modified application can be allowed to proceed to the user device for installation on the device, among other examples. -
FIG. 4 includes a block diagram 400 illustrating example principles and activities enabled through an example application behavior analysis engine. Application binaries 405 can be accessed or received by a disassembler data/control flow analyzer 410 which, in combination with ambient application knowledge 415 (e.g., collected from outside information sources as well as users, reviewers, etc.) such as application descriptions, reviews, comments, and other structured and unstructured data, can develop a model of the application logic 420 for each application binary 405. The disassembler and control flow analyzer 410 can identify behaviors 425 of the given application based on, for example, comparing the code or application logic model with known functionality defined in or identifiable from a software development kit and/or common APIs utilized by the corresponding client device operating system as well as most or all applications compatible with the client device. Some examples include the Google Android software development kit, Apple iOS software development kit, and Windows software development kit, among other examples. - Generally, a platform software development kit (or "SDK") can provide documentation, header files, libraries, commands, interfaces, etc. defining and providing access to the various platform subsystems accessible by applications compatible with the platform. In one example implementation, a platform SDK and corresponding APIs and API calls (i.e., calls to functions and routines of the API) can be represented in a model that can be used, for instance, by an application behavior engine, to determine behavior and functionality of applications compatible with the platform. The semantics of commonly used APIs are represented in a program-readable form along with critical information necessary to derive application behavior.
The semantics of the platform SDK can be represented so that an example application behavior engine can use the semantic model to understand and identify the operations and behaviors of a given application using the API call. For example, in one example implementation, all of the potential API calls of the platform can be represented, for instance through
API intelligence 430, by tagging the name of each respective API call with the behavioral tag describing what the respective API call does on the platform as well as the corresponding parameters of the API's operations and behaviors. As an example, a template of such a semantic representation can be modeled, for instance, as: -
<APIName: name>
  <Category: read/write/process/transform/.../...>
  <CategoryDetail>
    <Reads: sensitivity>
    <Writes: sensitivity>
    <Transform: sensitivity>
    <Sensitivity: red:5/orange:4/yellow:3/green:1>
  <Parameters: No of parameters>
    <ParameterIndex: Index>
    <Type: integer/object/string/../..>
    <Operation: input/output/transformative>
  <return value: void/integer/object/string/>
  <Dependency>
    <True/False>
  <Description>
    <APIDescription: description of the API>
    <Verbs: xxx>
    <Nouns: xxx>
- In the foregoing example, a "category" can designate the type of an API call and be used to identify the general functionality of such API calls, such as that the API call reads information from a particular subsystem, disk, etc., generates various messages, initiates various network behaviors, attempts to communicate with various outside servers, or triggers particular device functions or elements (e.g., a camera, SMS controller, etc.). "Sensitivity" can represent the respective sensitivity of the subsystem affected by or associated with the API in the context of the potential for malicious behavior in connection with the subsystem, such as whether reading from a particular memory location introduces the potential for spying, or whether the subsystem potentially permits the introduction of malware, unauthorized tracking or data collection, or the unauthorized or undesired reading or sending of SMS or email messages, among many other examples. Further, "dependency" can represent whether the output of the API can have an impact on other parts of the program in a direct way. For instance, a sendTextMessage( ) API can be identified as having no dependency where the API simply sends an SMS message out and does not return anything, among other examples.
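To make the template concrete, such semantic entries can be loaded into a small program-readable structure. The following Python sketch is illustrative only: the class layout, the sensitivity table, and the tags given to the sendTextMessage( ) entry are assumptions modeled on the discussion above, not an actual implementation from this disclosure.

```python
from dataclasses import dataclass, field

# Sensitivity bands mirroring the template's red:5/orange:4/yellow:3/green:1 scale.
SENSITIVITY = {"red": 5, "orange": 4, "yellow": 3, "green": 1}

@dataclass
class ApiSemantics:
    """One entry of an assumed program-readable semantic model of a platform API."""
    name: str
    category: str                 # read / write / process / transform / ...
    sensitivity: str              # color band of the affected subsystem
    has_dependency: bool          # whether the call's output feeds other program parts
    verbs: list = field(default_factory=list)
    nouns: list = field(default_factory=list)

    def weight(self) -> int:
        """Numeric sensitivity used when scoring behaviors."""
        return SENSITIVITY[self.sensitivity]

# Example entry: per the discussion above, sendTextMessage( ) writes to the
# SMS subsystem and returns nothing, so it carries no dependency.
send_sms = ApiSemantics(
    name="sendTextMessage",
    category="write",
    sensitivity="red",
    has_dependency=False,
    verbs=["sends"],
    nouns=["SMS message"],
)

print(send_sms.weight())  # 5
```

A behavior engine could hold one such entry per API call of the platform SDK and consult the category, sensitivity, and verb/noun tags when interpreting disassembled application logic.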
- Other information can be used by a behavior heuristics/rule engine 435 (e.g., of an example analysis engine (e.g., 228)) to determine behaviors of an application under assessment, such as global threat intelligence (GTI) 440 aggregating intelligence from a community of
sources 445, rules 450, and other information. - As noted above, an example application behavior analysis engine (e.g., 228) can possess functionality for identifying the control flows, operations, functionality, and behavior of a given application based, for instance, on a semantic representation of a standard platform SDK upon which compatible applications are based. In FIG. 5A,
a representation 500 of a simplified application control flow is shown for an example gaming application. While the functionality of the game may be in the main desirable, secure, and benign, deeper inspection of the code of the game application binary, in comparison with the semantic representation of the platform SDK as well as ambient application intelligence for the game application, may yield identification of other functionality that is not immediately or otherwise identifiable, understood, or appreciated by users, such as the application sending SMS messages either with or without a user's explicit knowledge or permission. In another example, shown in FIG. 5B, inspection of a particular object of an application binary may reveal the totality of functions and control flows of the given application as well as reveal dependencies between distinct programs, program units, or applications that the user may not otherwise realize, understand, or approve of. As an example, identified behavior heuristics can be represented externally, in some implementations, in an XML file that identifies the specific pattern of data flow and calls from which the behavior can be identified. For instance: -
<Pattern>
  <Call to API1( ): mandatory>
  <Call to API2( )/API3( )/....: mandatory>
  <Call to API5( )/API6( )/....: optional>
  <Call to API10( ): mandatory>
</Pattern>
- In some implementations, based for instance on a model of the semantic representation of the platform SDK, application logic can be modeled and rules can be applied to interpret the application logic and identify instructions and calls within a corresponding binary of the application that correspond with malicious, privacy-infringing, policy-violating, or other undesirable behaviors. The logical model of an application's functionality can include a representation (e.g., 505) of the application logic through data flow structures and control flow structures, among other examples. A dataflow structure can represent the lifetime of data objects as they pass through the application logic (e.g., 510) and on to other program units (e.g., 515), including external program units. A dataflow structure (e.g., 505) can be used to identify the flow of data from one part of the application program as it moves and is potentially transformed by the application logic. For example, a dataflow model can be used to deduce that particular data is being leaked by the application through an Internet communication post operation, among other examples. Further, control flow structures can represent the control flow of different function calls (e.g., 520, 525) to identify an originating source of an application call determined to be sensitive or undesirable. As an illustrative example, a call by the application to send an SMS message can be traced back, for example, to a UI element of an application interacted with by a user, or even to an autonomous event in a background process of the application, among potentially many other examples.
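A pattern like the XML sketch above can be checked against the ordered API calls recovered from an application binary. The matcher below is a minimal, assumed interpretation: the API names are the placeholders from the pattern, and the in-order handling of mandatory versus optional steps is one plausible reading, not a specified algorithm.

```python
# Each step of the pattern is (set of alternative API names, mandatory flag),
# mirroring the mandatory/optional markers in the XML sketch above.
PATTERN = [
    ({"API1"}, True),
    ({"API2", "API3"}, True),   # any listed alternative satisfies the step
    ({"API5", "API6"}, False),  # optional step
    ({"API10"}, True),
]

def matches(pattern, call_sequence):
    """Return True if every mandatory step appears, in order, in the
    observed call sequence; optional steps may be absent."""
    pos = 0
    for alternatives, mandatory in pattern:
        hit = next((i for i in range(pos, len(call_sequence))
                    if call_sequence[i] in alternatives), None)
        if hit is None:
            if mandatory:
                return False    # a required call never occurs
        else:
            pos = hit + 1       # continue matching after the found call
    return True

print(matches(PATTERN, ["API1", "API3", "API10"]))  # True
print(matches(PATTERN, ["API1", "API5", "API10"]))  # False: API2/API3 missing
```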
- Turning to the examples of
FIG. 6 , a simplified block diagram is illustrated representing various subsystems, devices, and functionality accessible by applications through one or more APIs defined in a platform SDK, for example. In some implementations, all platform subsystems can be categorized or assigned weights based on the sensitivity of the respective subsystem in the context of the potential that the subsystem could be manipulated or utilized in connection with a malicious or otherwise undesirable behavior. Such weights and sensitivities can be based on a variety of factors including, for example, the potential for an invasion of privacy, data leaks, financial sensitivity, among other examples. These factors can also form the basis of categorizations of the various subsystems of the platform. Such subsystems can include, for example, contact lists, photo galleries, email clients, calendars, Internet connectivity and browsing, graphics, video functionality, cameras, audio, security tools and engines, telephony, Wi-Fi capabilities, Bluetooth capabilities, data ports, battery power, touchscreens, global positioning systems, among potentially many other functionalities and subsystems including future functionality that can be integrated in mobile devices. - As represented in the example of
FIG. 7 , a rule engine of an application behavior analysis engine can access rules, for instance, from a rule database, including rules that have been custom defined for and/or by a particular user or set of users according, for example, to preferences of the users as well as policies applicable to the users (e.g., policies of an Internet service provider, enterprise network, broadband data provider, etc.). The rule engine can take as a further input an application logic model (e.g., developed based on a semantic representation of a platform SDK corresponding to the application) to assess the various operations and functionality of an application as identified in application logic model. The rule engine can assess the various operations and functionality of an application according to rules identified as applicable to the particular instance of an application, such as an instance of an application that has been attempted to be downloaded or installed on a particular user computing device of a user associated with the identified rules. Application behaviors can be identified by the rule engine including application behaviors identified as violating one or more rules (e.g., rules forbidding certain behaviors or actions) and prompting, in some instances, remediation of the identified application behaviors and/or assignment of the application to one or more operation modes on the destination user device, such as a quarantine or administrative operation mode, among other examples. - In some implementations, a human readable description of a behavior identified and based on a description of API semantics can be constructed. In one example, human relatable verbs and nouns can be associated with template messages in the semantic representation and mapped to particular human understandable descriptions of functions and operations available to the APIs. 
Further, in connection with assessments of an application according to the semantic model performed, for example, by an application behavioral analysis engine, a human-readable summary of the behavior analysis results can be generated from the mapping and presented to a user that describes the various functionality, as well as, in some implementations, the control flow and dataflow of the analyzed application. Such results can make use of the human-readable description to generate a description of the functionality uncovered during analysis of the application, including functionality that may otherwise be invisible to or difficult to detect by the user. For example, in one implementation, the template can be utilized and populated so as to identify and describe an example application's functionality for reading SMS data from the user's device. As an illustrative example, a corresponding description could be generated such as: "This application reads your SMS data from SMS inbox and sends to a website." Such a description could be constructed, for example, by filling in an example template based on the semantic representation of the platform SDK and APIs such as: "This application <verb: reads> your <noun:SMS data> from <noun: SMS inbox> and <verb: sends> to a <noun: website>", among other examples.
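The verb/noun filling described above can be sketched as a simple template substitution. The template string and slot names below are assumptions for illustration; a real implementation would draw them from the verb and noun tags in the semantic representation of the platform SDK.

```python
# Assumed template; the slots correspond to the <verb: ...>/<noun: ...> tags
# in the semantic representation discussed above.
TEMPLATE = "This application {verb1} your {noun1} from {noun2} and {verb2} to a {noun3}."

def describe(behavior):
    """Fill the human-readable template from a behavior's verb/noun mapping."""
    return TEMPLATE.format(**behavior)

sms_leak = {
    "verb1": "reads", "noun1": "SMS data", "noun2": "SMS inbox",
    "verb2": "sends", "noun3": "website",
}
print(describe(sms_leak))
# This application reads your SMS data from SMS inbox and sends to a website.
```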
- In some examples, the analyzed application behavior can reveal the use of other applications, programs, or services by the analyzed application. In some instances, a call to a local application, remote service, or other program by the analyzed application may be undesirable, for instance, when the other called application is identified as insecure, untrusted, or unknown, among other examples. In other instances, a program called or used by the analyzed application may be identified as a trusted program. Accordingly, in some implementations, an application behavior analysis engine can make use of, generate, modify, and otherwise manage whitelists and/or blacklists that identify the status and reputations of various programs that have been known to or could be potentially called by various analyzed applications. In some implementations, applications and services hosted by remote servers can additionally be identified in such whitelists and/or blacklists by the respective URLs or other address information corresponding to their respective host servers, among other examples.
- In some implementations, the behavioral analysis engine can identify the context in which a particular activity is performed, a platform API is accessed, or functionality is employed by the application under assessment. As an example, an analyzed application's attempts to access a platform telephony subsystem can be assessed based upon the cause or context of the attempt. For instance, in some contexts, a particular API call may be perfectly acceptable while in other contexts the API call can be undesirable. For instance, identified application functionality that accesses the telephony subsystem in response to a user interface interaction, such as a button press, may be assessed differently than an attempt by an application to access the telephony subsystem autonomously and not in response to a user-provided directive, among other examples.
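The context distinction described above can be sketched as a small scoring helper; the API names, origin labels, and two-way verdict are illustrative assumptions rather than defined elements of the system.

```python
# Hypothetical set of sensitive calls; a real model would tag these in the
# semantic representation of the platform SDK.
SENSITIVE_CALLS = {"sendTextMessage", "placeCall"}

def assess_call(api_name, origin):
    """Score a call by its origin: a call traced to a UI interaction (e.g., a
    button press) is treated as user-directed, an autonomous one is flagged."""
    if api_name not in SENSITIVE_CALLS:
        return "acceptable"
    return "acceptable" if origin == "ui_event" else "suspect"

print(assess_call("sendTextMessage", "ui_event"))    # acceptable
print(assess_call("sendTextMessage", "background"))  # suspect
```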
- As noted above, in some implementations, rules can be defined that can be used in the assessment of application behaviors. Such rules can be represented and configured for use in performing heuristic analysis of an application's logic or of a potentially malicious behavior identified by an application behavior analysis engine, including contexts in which the behavior is to be determined to be malicious. For instance, a rule engine can apply one or more rules to an application logic model to identify one or more potentially malicious or otherwise undesirable behaviors present in the application. In some implementations, a rule can be represented as:
-
<Rule>
  <Run>
    <Dataflow>
      <ReadOperation> of <red sub system> to a <WriteOperation> of <write sub system>
The rules can be generic or can be specific to a particular subsystem, etc., such as a rule to detect a data leak from a memory element storing personal contact data, among other examples. A specific application behavior can be derived based on application of a single rule or multiple rules. - In some implementations, an application behavior analysis engine can be hosted on one or more server computing devices remote from the mobile user devices for which analysis is performed. In other examples, at least a portion of the application behavior analysis engine can be provided on the user device, alternatively or redundantly with functionality of server-side application behavior analysis engine components. For instance, in one example implementation, a user computing device can be provided with application behavior analysis engine functionality allowing at least a partial or quick preliminary assessment of an application to be performed at the user device to thereby provide a user with fast feedback as well as assess whether an application should be quarantined, denied download or installation, and/or forwarded to a remote application behavior analysis engine, such as one provided in a cloud system, then allowing for a more robust behavioral analysis of the application (that could possibly introduce increased latency into the behavioral analysis assessment).
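The generic leak rule above, a read of a "red" (highly sensitive) subsystem flowing to a write of an outbound subsystem, can be sketched as a filter over dataflow pairs. The subsystem names and the flow-record shape below are assumptions for illustration.

```python
# Assumed subsystem labels: "red" sources are highly sensitive reads, and
# outbound sinks are writes that can carry data off the device.
RED_SOURCES = {"contacts", "sms_inbox", "location"}
OUTBOUND_SINKS = {"internet_post", "sms_send", "bluetooth_send"}

def find_leaks(dataflows):
    """dataflows: (read_subsystem, write_subsystem) pairs extracted from the
    application logic model; return the pairs that match the leak rule."""
    return [(src, dst) for src, dst in dataflows
            if src in RED_SOURCES and dst in OUTBOUND_SINKS]

flows = [
    ("contacts", "internet_post"),  # contact data posted to the Internet
    ("graphics", "screen"),         # benign rendering flow
    ("sms_inbox", "sms_send"),
]
print(find_leaks(flows))  # [('contacts', 'internet_post'), ('sms_inbox', 'sms_send')]
```

A subsystem-specific rule, such as one watching only the contact-data memory element, would simply narrow the source set.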
- In some implementations, during an analysis of an application, downloading, installation, or launching of the analyzed application may be prevented or delayed until the analysis is completed. In some instances, a user can be provided with a prompt identifying the analysis of the application as well as providing the user with various options for dealing with the installation, downloading, or launching of the analyzed application. For instance, a user may be provided with the option of skipping the analysis, delaying installation of the analyzed application, or assigning the analyzed application to a particular mode, among other examples. Additionally, in some implementations, a prompt presented to the user in connection with the assessment may be presented together with information, such as preliminary information, gleaned from the behavioral analysis engine assessments and/or external intelligence relating to the analyzed application. Such intelligence can include, for example, intelligence gleaned by the behavioral analysis engine in previous assessments of the analyzed application, among other examples. Indeed, in some implementations, the behavioral analysis engine can indicate to the user behaviors discovered for the application, how other users have responded to feedback received from the behavioral analysis engine regarding the particular analyzed application, among other examples.
- In some implementations, the behavioral analysis engine can maintain blacklists, greylists, and/or whitelists of applications known to and/or previously analyzed by the behavioral analysis engine. Such blacklists, greylists, and/or whitelists can be based on historical intelligence collected from previous behavioral analyses, outside intelligence from other sources, and other users. The behavioral analysis engine can utilize such information to perform an initial assessment of an application and leverage information gleaned from previous analyses. Initial filtering or feedback can thereby be provided to a user to assist the user in determining how to deal with a particular application as well as whether to initiate further behavioral analysis on the application using the behavioral analysis engine.
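The initial assessment against such lists can be sketched as a cache lookup that maps a prior verdict to a disposition. The package names, verdict labels, and dispositions below are illustrative assumptions.

```python
# Hypothetical cached verdicts from prior analyses and outside intelligence.
KNOWN_APPS = {
    "com.example.flashlight": "whitelist",
    "com.example.smsspy": "blacklist",
    "com.example.newgame": "greylist",
}

def initial_assessment(app_id):
    """Map a cached verdict to a suggested disposition before (or instead of)
    a full behavioral analysis."""
    verdict = KNOWN_APPS.get(app_id, "unknown")
    return {
        "whitelist": "allow",
        "blacklist": "block",
        "greylist": "quarantine and analyze further",
        "unknown": "submit for full behavioral analysis",
    }[verdict]

print(initial_assessment("com.example.smsspy"))   # block
print(initial_assessment("com.example.weather"))  # submit for full behavioral analysis
```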
- Behavioral analysis of applications and/or blacklists/whitelists can further incorporate or consider general reputation information of developers or other parties identified as responsible for various applications, among other examples and considerations. Rules can be defined that consider the trustworthiness or untrustworthiness of the developer, distributor, etc. of an application. For example, an application development score rating can be computed for a developer based on aggregate analyses of that developer's applications by the behavioral analysis engine. For instance, such a rating can be derived as: AppDeveloperRating = f(total number of apps, weighted average of undesired behavior in apps, popularity of the apps, average ratio of low ratings), among other examples. For instance, in one illustrative example, a weighted average of undesired behavior can be generated for a set of applications of a developer:
-
| Behavior | Weight (out of 10) | No. of occurrences | Total weight |
| --- | --- | --- | --- |
| Contacts leakage | 9 | 2 | 18 |
| Device ID leakage | 2 | 5 | 10 |
| Message Leakage (SMS) | 8 | 3 | 24 |
| Location leakage | 5 | 4 | 20 |
| Unnecessary permissions | 2 | 1 | 2 |
and the average weight can be derived as Average Weight = Total Weight/Total Number of Apps, among other example implementations. - Outside sources, such as intelligence databases (e.g., a global threat intelligence (GTI) feed), can be used to identify malicious behaviors that have been detected across one or more networks and that may be employed by applications assessed by behavioral analysis engines. For instance, various URLs, IP addresses, phone numbers, and files can be identified that have been previously determined to be associated with or used in other malicious attacks, malware, or suspect systems. Additionally, a behavioral analysis engine can interface with intelligence databases to provide additional intelligence gleaned from the behavioral analyses of applications performed by the behavioral analysis engine itself, among other examples.
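As a rough illustration, the weighted-average and developer-rating computation described above might be sketched as follows. The behavior table mirrors the example above; the particular combining function f() and its constants are assumptions, since the text leaves f() unspecified:

```python
# Illustrative sketch of the developer-rating computation described above.
# (behavior, severity weight out of 10, number of occurrences)
BEHAVIOR_TABLE = [
    ("Contacts leakage", 9, 2),
    ("Device ID leakage", 2, 5),
    ("Message Leakage (SMS)", 8, 3),
    ("Location leakage", 5, 4),
    ("Unnecessary permissions", 2, 1),
]

def total_behavior_weight(table):
    """Sum of severity weight * occurrences over all observed behaviors."""
    return sum(weight * occurrences for _, weight, occurrences in table)

def average_weight(table, total_apps):
    """Average Weight = Total Weight / Total Number of Apps."""
    return total_behavior_weight(table) / total_apps

def app_developer_rating(total_apps, table, popularity, low_rating_ratio):
    """Hypothetical f(): higher is better; penalizes undesired behavior and
    a high ratio of low store ratings, rewards popularity (0..1 scales)."""
    avg = average_weight(table, total_apps)
    return max(0.0, 10.0 - avg - 5.0 * low_rating_ratio + 2.0 * popularity)

total = total_behavior_weight(BEHAVIOR_TABLE)        # 18 + 10 + 24 + 20 + 2 = 74
avg = average_weight(BEHAVIOR_TABLE, total_apps=10)  # 7.4
```

A real implementation would tune (or learn) the penalty and reward factors; the point is only that undesired-behavior weight drags the rating down while popularity offsets it.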
- Further, in some systems and platforms, applications offered by one or more application servers or storefronts may provide users with basic descriptions, ratings, user feedback, etc. collected for a given application. Unfortunately, in many instances, such ratings, application descriptions, content ratings, etc. may be provided by, manipulated by, or otherwise influenced by the application developers themselves, potentially diminishing the truthfulness or legitimacy of the information provided to users regarding some applications. Accordingly, in some implementations, intelligence (e.g., behavioral descriptions) gleaned from behavioral analyses of applications performed by an example behavioral analysis engine may be used to supplement, correct, or otherwise modify descriptions provided to users in connection with their browsing, purchasing, and downloading of applications available on a platform. Further, in some implementations, a behavioral analysis engine can make use of these default application descriptions, content ratings, user feedback, etc. as external intelligence considered in connection with a behavioral analysis. In still other examples, a behavioral analysis engine may be used to identify common behavioral traits between multiple applications that can serve as the basis for categorizing the applications according to behavior. Such categories can then be provided to users to assist them in better understanding the qualities and behaviors, as well as the potential risks, of various applications, among other examples.
- Turning to
FIG. 8 , a simplified schematic diagram 800 is shown of an example flow for performing deep analysis of application behavior (e.g., using a behavioral analysis engine) and performing application healing in an attempt to remedy those behaviors determined to be undesirable in an application while still preserving other core functionality of the application, in some examples. As shown, application binaries can be submitted to a disassembler and data control flow analyzer 410 (e.g., of a behavior analysis engine) to develop application logic models (e.g., 420) based, in some examples, additionally on ambient application knowledge 415, intelligence, and the like. As noted above, the model of application logic 420 can be assessed based on defined rules, platform API intelligence, and behavioral heuristics through a behavioral heuristics/rules engine 435 to identify application behaviors of a respective application. Further, sections of the application's code can be identified during the assessment as responsible for exhibited undesirable behavior. This code can be flagged for remediation. Additionally, in instances where application behaviors are identified as undesirable and are requested or dictated, by a user, administrator, or predefined rules, to be healed, the application binaries can be further processed to remove, block, or otherwise remediate the offending behaviors and corresponding code, thereby generating healed versions 232 of the application binaries that a user can then cause to be downloaded, installed, and executed on the user's device. Additionally, as noted above, the global threat intelligence feed 440 or another intelligence database can provide intelligence for consideration in behavioral analyses as well as application healing. Additionally, intelligence gleaned from the behavioral analyses can be shared with outside intelligence databases that additionally receive input, data, and intelligence from a community of users and systems 445.
- Turning now to the example of
FIG. 9 , an additional flowchart 900 is shown representing decisions made in connection with the management and remediation of applications determined to include undesirable behaviors based on behavioral analyses of the applications. For instance, rules and policies can be defined, for instance by a user or a system or network administrator, governing how and under what conditions applications that have been determined to include one or more undesirable behaviors are to be handled. Such policies can, for example, identify particular types of undesirable behaviors and map such behaviors to predefined courses of action, such as healing or remediation of the applications, blacklisting or whitelisting of the applications, or quarantining of the applications, among other examples. Additionally, user inputs can drive the management of an application's deployment on a user computing device. Such inputs can be received in connection with prompts presented to the user and can include, for example, requests to remediate one or more identified undesirable behaviors or instructions to assign the analyzed application to a particular operation mode or quarantine area, among other examples. - As noted above, static healing and personalization of application behavior can be performed by a healing engine, allowing the code of the application to be modified to generate a “safe” version of the application that allows the user to retain safe or legitimate functionality of the application while removing undesirable behaviors. Such healing can in some cases be personalized or customized according to particularly-defined policies driving the healing, thereby allowing a user, service provider, device manufacturer, etc. to control and personalize the functionality of applications to be installed on corresponding user devices. In
FIG. 10 , a simplified block diagram 1000 is illustrated showing the flow of an example healing of an original application 1005. Upon identifying 1010 undesirable behaviors and offending sections of the code of the application binary, a healing engine can be provided for identifying, removing, replacing, or blocking the offending code and corresponding behaviors in order to generate a modified application binary 1015. As an example, a healing engine 228 may include logic for modifying an application by removing or blocking various types of undesired behaviors, such as, in this example, unauthorized reads or accesses of SMS functionality, by removing the offending instructions discovered in the original application binary. In other instances, such as shown in this example, a healing engine may modify the offending code, such as by rewriting the code to redirect an API call to a trusted system, destination, address, etc. A healing engine 228 can modify the original code with minimal changes so as to avoid affecting the core desired functionality of the application. Further, healing policies can identify the patterns that are considered when identifying application code for healing. This can be represented, for example, in an XML file that identifies the heuristic pattern of code corresponding to an offending behavior. Each type of defined or identified pattern of code can be healed by a specific healing method, such as according to corresponding policies. Such methods can be identified and defined in such a way that the healing does not impact the rest of the application's functionality. - A variety of healing methods can be employed by an application healer engine. For instance, a particular offending line of code or functionality can be identified as a final or leaf node in a control chain. In such instances, the offending code may be determined to be suppressible or removable without affecting other dependencies in the application, among other examples.
In another example, if the removal of a particular API call is determined to likely have no impact on surrounding code, the removal healing method can be applied. The nature and character of APIs can be learned, for example, from the semantic platform SDK representation, among other examples. In other instances, the offending behavior can stem from one or more sections of code and may call for multiple healing methods to be applied to remediate the behavior, such as by replacing the data in a register to alter the behavior of the API, or by redirecting the API call to a new version of the API with the same interface by replacing the offending API code with the new API code, among other examples. In instances where a new version of an API is introduced, the new API may, for example, do nothing and set the register status so as not to impact other parts of the program, process the inputs in a different way to avoid the undesired behavior, or perform pre-processing and/or post-processing of the input/output parameters and call the original API, among other example techniques that resolve the undesirable behavior.
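As noted above, healing policies mapping heuristic code patterns to healing methods can be represented in an XML file. A minimal sketch of what such a policy file and its parsing might look like follows; the element names, attributes, behavior labels, and patterns are all invented for illustration, as the text does not specify a schema:

```python
# Hypothetical healing-policy XML and its parsing with the standard library.
import xml.etree.ElementTree as ET

POLICY_XML = """
<healingPolicies>
  <policy behavior="SMS_READ" method="remove">
    <pattern>invoke .*SmsManager.*getMessages</pattern>
  </policy>
  <policy behavior="LOCATION_LEAK" method="redirect" target="TrustedLocationProxy">
    <pattern>invoke .*LocationManager.*getLastKnownLocation</pattern>
  </policy>
</healingPolicies>
"""

def load_policies(xml_text):
    """Return {behavior: (method, pattern, redirect_target)} from a policy doc."""
    policies = {}
    for p in ET.fromstring(xml_text).findall("policy"):
        policies[p.get("behavior")] = (
            p.get("method"),
            p.findtext("pattern").strip(),
            p.get("target"),  # only meaningful for redirect-style healing
        )
    return policies

policies = load_policies(POLICY_XML)
```

Each pattern would be matched against disassembled application code, with the named method (removal, redirection, etc.) applied to matching sections.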
- Turning to
FIG. 11 , a simplified block diagram is illustrated showing the identification of code relating to particular undesirable behaviors. For instance, particular sections of an application's code can be identified as corresponding to respective undesirable behaviors detected for the application. - In
FIGS. 12A-12E , additional examples are illustrated of the detection of undesirable behaviors as well as the remediation of those behaviors. For example, in FIG. 12A , an example code fragment allowing an application to send latitude and longitude information to an outside server is shown as having been processed to populate an API template, for instance, utilizing a behavior analysis engine. As shown in FIG. 12B , portions of the application code can be identified that correspond to the behavior of collecting geo-positional data and sending the geo-positional data to the outside server. In accordance with one example, the offending lines of code can be replaced, for example, with code that masks or redirects the sending of the geo-positional data to prevent the application from tracking user location, among other examples. In another example, illustrated in FIG. 12C , a control flow can be identified within an application along with corresponding application code. As shown in the examples of FIGS. 12D-12E , remediation of a particular undesirable behavior can include deletion of an offending line of code, among other examples. -
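The remediation techniques described above (deleting an offending call, or replacing it with a redirected call) can be sketched on a toy instruction list. The instruction syntax, API names, and the heal() helper are hypothetical; real healing would operate on disassembled binary code:

```python
# Minimal sketch of deletion- and redirection-style healing on toy instructions.
import re

def heal(instructions, pattern, method, replacement=None):
    """Apply one healing method to instructions matching `pattern`.

    method: 'remove'   - drop the offending call (safe for leaf nodes),
            'redirect' - swap in a replacement call with the same interface.
    """
    healed = []
    for ins in instructions:
        if re.search(pattern, ins):
            if method == "remove":
                continue                  # suppress the offending call
            healed.append(replacement)    # substitute the redirected call
        else:
            healed.append(ins)
    return healed

app = [
    "load r0, location_data",
    "call send_to_server(r0)",   # offending geo-data leakage
    "call render_ui()",
]
safe = heal(app, r"send_to_server", "redirect", "call send_masked(r0)")
```

Only the matched instruction changes; the surrounding instructions, and hence the application's core functionality, are preserved.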
FIG. 13 illustrates an example flow 1300 in connection with remediation of one or more detected undesirable behaviors of an application. For instance, in connection with the dynamic personalization of an application's behavior for a particular user, the composite behaviors of the application and the corresponding code segments can be identified. A user interface can be presented in connection with the healing or customization of the application, allowing the user to select particular identified behaviors for remediation or modification. In one example implementation, the user interface can be provided in connection with an application healing engine, with the user inputs directing how (e.g., which identified behaviors) the application healing engine is to modify the application. In another example, an application healing engine can insert one or more user interface controls into the original binary of the application, allowing the user, at launch of the modified application, to dynamically enable, disable, or otherwise remediate or customize the behavior of the application. For instance, based on the selections of the user, an original section of the code corresponding to an accepted behavior can be utilized in lieu of a healed version of the same code, among other examples. Effectively, each of the segments of the code where a behavior is demonstrated can be selectively turned off or on based on the user's preferences and inputs. Further, the user interface can provide a user with the option of saving the settings of an application so that the selection of a particular subset of application behaviors persists and is available the next time the application is launched on the user's device. - In some implementations, functionality can be provided to define, enable, and employ defined usage modes on user devices. Traditionally, user devices, such as smart phones and tablet computers, among other examples, are designed to support a single user and application profile.
However, a single operation profile and mode may not be appropriate for all of the actual users of the device or the situations in which the device is used. For instance, a user may desire to loan their device to a friend for some short period of time, but would nonetheless like to retain control of access to some of the sensitive applications and data on the device, such as email applications, contacts, calendars, messaging functionality, etc. In other instances, the user may desire to allow a child to temporarily use the device, for example, to play a game, but would prefer for other applications (e.g., web browsers) and access to certain device settings and data to be blocked from the child. Additionally, users may desire to restrict usage of some subset of the applications on the device to specific times, locations, and situations. For instance, games and social networking applications may be desired to be disabled during school hours, among other examples.
-
FIG. 14A illustrates a simplified block diagram 1400 a of an example implementation of a mode manager. For instance, various modes may be defined based on intelligence gleaned from the user device as well as from outside services. A user may define one or more modes through a user interface, and a mode manager on the device, for instance, may manage access to the various modes, for example, using dedicated credentials assigned to each of the modes. Additionally, as noted above, an application monitoring service or application behavioral analysis engine may recommend particular applications for a quarantine or high-security mode available on the user device. Accordingly, a user may define such modes to restrict access to potentially risky or currently analyzed applications to administrative, adult, or other trusted users, among other examples. -
FIG. 14B illustrates another simplified block diagram 1400 b illustrating principles of an application mode manager. An application mode manager 248, in some implementations, may include various modules and functionality, such as a mode setup manager 1405, lock service 1410, lock manager 1415, credential manager 1420, application access manager 1425, application protection service 1430, and password engine 1435, among other examples. For instance, in the illustrated example, a user with administrative privileges can set up passwords or PINs and assign these credentials to modes defined by the user, for instance, using a mode setup manager. An access manager can utilize a credential manager to verify whether valid credentials have been received that allow a current user of the device to access one of a set of modes defined for the device. In the event that incorrect credentials are entered, a lock manager can invoke a lock service to lock the current user out of one or more applications by assigning the user to a restricted mode or locking out the user altogether. - In some implementations, a device mode can be composed of an exclusion list or inclusion list. Device modes can be defined as respective sets of applications that are either allowed or somehow protected in a given mode, in the sense that their usage is prohibited or limited. In some instances, an inclusion list can be defined for a mode that indicates a particular subset of the applications and/or subsystems of a device that are accessible under the corresponding mode (i.e., with the remaining applications protected or locked in that mode). For instance, a mode can be defined according to: <ModeName, Inclusion/Exclusion, Access PIN, App1, App2, App3 . . . AppN>. In some instances, each device mode can be protected and associated with a particular password. A master mode can be defined that allows access to the entirety of the device's functionality and applications.
Accordingly, a master password can be provided that enables access to the master mode. Within the master mode, the user may be provided with access to a management console for managing the set of modes available or defined at the device. Accordingly, the user may edit or define modes through the management console, as well as activate or delete predefined modes. An example management console can allow a user to select, from a listing of applications, those applications the user wishes to designate as protected or accessible in any given mode. In some cases, a single application can be allowed or protected under multiple different modes.
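The <ModeName, Inclusion/Exclusion, Access PIN, App1 . . . AppN> mode definition described above can be sketched as a small data structure with an access check; the field names and example modes are assumptions for illustration:

```python
# Sketch of a device-mode record with inclusion/exclusion semantics.
from dataclasses import dataclass, field

@dataclass
class DeviceMode:
    name: str
    list_type: str              # "inclusion" or "exclusion"
    access_pin: str
    apps: set = field(default_factory=set)

    def allows(self, app):
        """Inclusion list: only listed apps run; exclusion list: all but listed."""
        if self.list_type == "inclusion":
            return app in self.apps
        return app not in self.apps

kids = DeviceMode("Kids", "inclusion", "1234", {"PuzzleGame"})
guest = DeviceMode("Guest", "exclusion", "9999", {"Email", "Banking"})
```

The same application may appear in the lists of several modes, matching the note above that an application can be allowed or protected under multiple modes.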
- In some implementations, mode passwords may be stored in encrypted memory. For instance, the password of each mode can be encrypted using a key generated from the same password. A stored, encrypted password can then be validated by decrypting the password with a key generated from the password entered by the user. The decrypted data can then be compared with the user-entered password. Based on the password provided by the user, a corresponding mode can be identified and authenticated to allow the user access to the mode. In some implementations, the user may manually lock the device, or the device may lock itself, for instance, after a prolonged period of inactivity. When attempting to unlock or wake the device, a user may again be presented with a login prompt requesting a password of one of the modes available and defined for the device.
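The password scheme described above (each mode's password encrypted with a key derived from that same password, then validated by decrypting with a key derived from the user's entry) can be sketched as follows. The XOR keystream here is an illustrative stand-in for a real cipher (e.g., AES), and the key-derivation parameters are assumptions:

```python
# Sketch of mode-password storage and validation via a self-derived key.
import hashlib, hmac, os

def derive_key(password, salt, length=32):
    """Derive a key from the password itself (PBKDF2-HMAC-SHA256)."""
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000, length)

def xor_stream(data, key):
    """Toy keystream cipher; a real implementation would use AES or similar."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def store_mode_password(password):
    """Encrypt the mode password with a key generated from that password."""
    salt = os.urandom(16)
    key = derive_key(password, salt)
    return salt, xor_stream(password.encode(), key)   # encrypted record

def validate(entered, record):
    """Decrypt with a key from the entered password; compare to the entry."""
    salt, ciphertext = record
    key = derive_key(entered, salt)
    return hmac.compare_digest(xor_stream(ciphertext, key), entered.encode())

rec = store_mode_password("kids-pin-1234")
```

A wrong entry derives a different key, so decryption yields bytes that cannot match the entered password, and validation fails.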
- In some implementations, modes can be hierarchical. For instance, a user logged into a higher level mode (i.e., a mode providing a relatively greater level of access), may be able to freely move to another mode without providing credentials for that lower-level mode. On the other hand, a user who has been authenticated to a lower level mode may be forced to enter additional credentials when attempting to access another mode at a higher level in the hierarchy than the lower-level mode to which the user was previously authenticated. For example, in one instance, four device modes can be defined where:
- Mode 1 is an admin level mode;
- Mode 2 is a guest level mode;
- Mode 3 is a guest level mode; and
- Mode 4 is a low privilege mode;
- and the hierarchy is defined as: Mode 1 > (Mode 2 and Mode 3) > Mode 4, where Mode 2 is at the same level as Mode 3, among other example implementations.
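The hierarchical rule above (free movement to same-level or lower-level modes, re-authentication when moving upward) can be sketched as a simple level comparison; the numeric levels mirror the Mode 1 > (Mode 2 and Mode 3) > Mode 4 example:

```python
# Sketch of the hierarchical mode-switch rule: higher level = more access.
MODE_LEVELS = {"Mode1": 3, "Mode2": 2, "Mode3": 2, "Mode4": 1}

def needs_credentials(current_mode, target_mode):
    """True only when switching upward in the hierarchy (more access)."""
    return MODE_LEVELS[target_mode] > MODE_LEVELS[current_mode]
```

So an admin authenticated to Mode1 may drop to any other mode freely, while a Mode4 user must re-authenticate to reach Mode2, and Mode2 and Mode3 are interchangeable without credentials.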
- In some implementations, the configuration of the device can be altered, customized, or at least partially restricted when certain modes are active. For example, a particular mode can activate or deactivate GPS functionality, data access, telephony, as well as certain applications. Further, in some examples, device modes can be provided that secure the data of particular applications associated with a mode. For instance, once a new mode has been created and assigned a corresponding access level to a set of applications, the data of these applications may be protected by encryption through a separate encryption key. This can be implemented, for example, by using an encrypting file system for encrypting files and folders, among other examples.
- In some implementations, the executable code of applications can be secured to protect against applications being used in modes that disallow access to and/or use of one or more of the behaviors or features of the application. For instance, in one implementation, the application executable can be stored in encrypted secondary storage. An operating system loader of the user device can gain conditional unencrypted access to the executable code, in some examples, only if the application is found in an allowed application list for the active device mode in which access to the application is attempted, among other potential implementations.
- In some examples, defining multiple device modes for a user device can further result in the provision of multiple unique home screens to be presented in each of the corresponding modes. As a result, in such implementations, the appearance of a given home screen can indicate to a user the mode that is active on the device as well as access privileges available in that mode. In some instances, home screens can include icons of applications that are available within that corresponding mode, hiding or obscuring the icons of other applications that are protected within that mode, among other examples.
- Further, in some instances, device modes can be created automatically, for instance, based on identified behaviors and security profiles of applications that are detected or loaded on the user device. For instance, a mode manager can make use of behavioral analyses performed, for example, by an example application behavioral analysis engine, to identify applications that exhibit a common category of behaviors or security profiles. For instance, applications identified as permitting access to online resources may be grouped and assigned dynamically to one or more modes that have been defined as allowing such access. Other modes, such as modes dedicated to underage users, may deny access to applications that allow users to access the Internet, among other examples. Other example categories may include applications that enable telephony or mobile messaging functionality, applications that make use of subsystems that utilize sensitive data or collect potentially private information (e.g., cameras, voice recorders, GPS systems, etc.), and other examples. In some implementations, ambient intelligence relating to an application, such as an age rating (e.g., 7+, 12+, 18+ years, etc.), user reviews, or other information, may be used to categorize applications and group them into various modes. For example, a description of an application may include an age or maturity rating as well as reasons for the maturity rating. Accordingly, in one example, one or more modes may be defined that, for example, block access by child users to applications with higher maturity ratings, among other examples.
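The automatic grouping of applications into modes based on behavior categories and age ratings described above might be sketched as follows; the category names, threshold, mode names, and example applications are all invented for illustration:

```python
# Sketch of automatic mode assignment from behavior categories + age ratings.
def assign_modes(apps, child_max_age_rating=7):
    """apps: {name: {"behaviors": set of category tags, "age_rating": int}}."""
    modes = {"child": set(), "standard": set(), "quarantine": set()}
    for name, info in apps.items():
        if "undesired" in info["behaviors"]:
            modes["quarantine"].add(name)          # pending healing or review
        elif ("internet" in info["behaviors"]
              or info["age_rating"] > child_max_age_rating):
            modes["standard"].add(name)            # adult/trusted users only
        else:
            modes["child"].add(name)               # safe for underage users
    return modes

apps = {
    "PuzzleGame": {"behaviors": set(), "age_rating": 3},
    "Browser":    {"behaviors": {"internet"}, "age_rating": 12},
    "FlashLight": {"behaviors": {"undesired"}, "age_rating": 3},
}
modes = assign_modes(apps)
```

In practice the behavior tags would come from the behavioral analysis engine and the age ratings from ambient intelligence such as store descriptions.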
- Other global or distributed intelligence can also be used to develop information for a given application, such as illustrated in the simplified block diagram 1900 of
FIG. 19 . For instance, application information can be constructed from security information regarding behaviors of an application from global threat intelligence 440, publisher/developer reputation information 1905, app store feedback and reviews 1910, and behavior analysis results 1915, among other examples. Such information (e.g., 440, 1905, 1910, etc.) can be used in combination with behavioral assessments 1915 of the applications (e.g., whether an application potentially leaks data, provides location information, enables SMS messaging, etc.) to assign certain applications to particular device modes, such as quarantine or administrative modes, among other examples. A user may further designate custom categories or behaviors, or select pre-defined categories or behaviors, as the basis for assignments of applications to respective modes, rather than individually selecting the applications for inclusion in one or more modes on an à la carte basis, among other examples. - Turning to the example of
FIG. 15A , an example algorithm is represented for the storing of password information associated with a particular mode. FIG. 15B represents an example algorithm for validating a password and identifying a mode to activate that corresponds to the entered password. It should be appreciated that the algorithms of FIGS. 15A-15B are non-limiting examples presented merely for purposes of illustration and that other alternative algorithms and implementations can be utilized in other instances. - Turning to the example of
FIG. 16 , in some implementations, modes defined by a given user may be provided, for instance, to an application management service, cloud service, or other service (e.g., 1600) that allows one or more modes, as well as rules associated with the modes, to be aggregated and shared with other users. Additionally, shared device modes maintained by a mode sharing service 1600 can be browsed and selected for download and utilization on user devices. - In some implementations, such as shown in the example of
FIG. 17 , modes can be activated automatically based on context information detected, for example, by the device itself. A user, in some examples, can configure (e.g., on the management console) rules for automatically activating particular modes. For instance, a particular mode can be activated automatically in response to the detection of a specific context at the user device. Such contexts can include, for example, detecting the location or proximity of the device within a defined geo-fence, detecting that the device is in proximity to other devices, detecting the device in range of particular data networks, detecting a user of the device (e.g., based on user biometric information collected by the device), a detected time of day, device battery status, usage activity (e.g., to guard against particular users spending too much time on the device, etc.), and whether the device is traveling or in motion (e.g., as detected through GPS functionality, accelerometers, or other functionality on the device), among potentially many other examples. - Turning now to the example of
FIG. 18 , in some implementations, modes can be provisioned and configured through a remote service, such as a cloud service, allowing a user to activate, deactivate, or define a mode remotely. Using such a service, a user can create a mode remotely (e.g., using a computer other than the target mobile user device), provision one or more modes to the target user device, and activate or deactivate a mode on the user device from a remote location. Further, an administrator can also use the service to provision such modes on mobile user devices, as well as define rules and contexts for automatically activating, applying, or deactivating a given mode, among other examples. -
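The context-driven activation rules described in connection with FIG. 17 can be sketched as a list of (mode, predicate) pairs evaluated against a detected context; the context fields, rule format, and mode names are hypothetical:

```python
# Sketch of rule-based automatic mode activation from detected device context.
def pick_mode(context, rules, default="Master"):
    """Return the first mode whose rule predicate matches the context."""
    for mode_name, predicate in rules:
        if predicate(context):
            return mode_name
    return default

rules = [
    # e.g., school hours inside a school geo-fence -> restricted mode
    ("School", lambda c: 8 <= c["hour"] < 15 and c["location"] == "school"),
    # e.g., low battery -> power-saving, reduced-functionality mode
    ("LowPower", lambda c: c["battery"] < 0.1),
]

mode = pick_mode({"hour": 10, "location": "school", "battery": 0.8}, rules)
# mode == "School"
```

Rule order encodes priority here; a fuller implementation might also weigh proximity to other devices, biometric identification of the current user, or motion state, as listed above.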
FIGS. 20A-20D illustrate example screenshots of user interfaces showing particular features of some example implementations of mode management on a mobile user device. For instance, the screenshot in FIG. 20A illustrates a user interface for defining a new mode and mode password. A similar user interface can be provided to allow a user to select and activate one of multiple available modes on the device and/or provide credentials for the selected mode. In some implementations, a user device may include native login credentials or a native login manager. A mode manager may be implemented as an application itself that overrides a native login manager and replaces a native login screen with mode-specific login prompts (e.g., that allow the multi-mode functionality of the user device). In some instances, a user may not be able to visually distinguish that a user device is provisioned with multiple modes, with the login screen capable of accepting one of a plurality of different login codes, each login code corresponding to a supported mode (including hidden modes) provisioned on the user device. - The screenshot of
FIG. 20B illustrates a view of a home screen for a particular mode. As shown in this example, a set of restricted applications can be designated that can only be accessed by providing credentials for and activating a higher level mode (e.g., one that permits access to the restricted applications). Further, a My Apps folder can provide access to those applications that have been enabled in the currently active mode. The screenshot of FIG. 20C provides another view of an example administrative screen that permits users to activate, edit, or create new modes. Additionally, the example screenshot of FIG. 20D illustrates a user interface that can be provided in some implementations of a mode manager, allowing a user to designate, from a list of applications on the device, which applications are to be included or protected in a given mode, and so on. It should be appreciated that the foregoing examples are provided merely for the sake of illustrating certain principles and should not be interpreted as limiting examples. Indeed, a variety of different implementations, user interfaces, program architectures, operating systems, SDK platforms, and method sequences can be substituted for those examples described above without departing from the general principles illustrated and described in this Specification. -
FIGS. 21A-21C are flowcharts 2100a-c illustrating example techniques in the management of applications on mobile user computing devices. For instance, in the example of FIG. 21A , code of a particular application can be analyzed 2105, for instance, against a semantic representation of a platform, such as a representation of a platform SDK and/or APIs. A set of behaviors of the particular application can be identified 2110. At least one undesirable behavior in the set of behaviors can be identified 2115, for instance, based on the user selection of one of the identified set of behaviors or automatically according to rules and/or policies defined (e.g., by a user or administrator) for applications to be downloaded, installed, launched, or otherwise used at a particular mobile computing device. - In the example of
FIG. 21B , a behavior can be identified 2120 in a set of behaviors detected for a particular application (e.g., according to the principles of the example of FIG. 21A ). A section of code of the particular application can then be identified 2125 as corresponding to the identified behavior. A remediation action can be performed 2130 on the identified section of code to automatically remediate the behavior, for instance, in response to an identification that the identified behavior is an undesirable behavior, etc. The remediation action can result in the dynamic generation of a “healed” version of the particular application that retains at least a portion of its original functionality, with the undesired functionality being blocked or stripped from the healed version. - In the example of
FIG. 21C , a particular one of a plurality of modes can be activated 2140. The modes can be defined for a particular user computing device and dictate what subset of the functionality of the computing device and its software may be accessible to a particular user having credentials for accessing a respective mode in the plurality of modes. Access can be restricted 2145 to one or more applications installed on the user computing device according to the activation 2140 of the particular mode. In addition, in some implementations, activation of the particular mode can result in a restricted or alternate configuration of the computing device being applied that thereby limits a user's access to one or more subsystems and functionality, including hardware functionality, and to settings and data of the user computing device, among other examples. - Although this disclosure has been described in terms of certain implementations and generally associated methods, alterations and permutations of these implementations and methods will be apparent to those skilled in the art. For example, the actions described herein can be performed in a different order than as described and still achieve desirable results. As one example, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve the desired results. In certain implementations, multitasking and parallel processing may be advantageous. Additionally, diverse user interface layouts and functionality can be supported. Other variations are within the scope of the following claims.
- Embodiments of the subject matter and the operations described in this specification can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them. Embodiments of the subject matter described in this specification can be implemented as one or more computer programs, i.e., one or more modules of computer program instructions, encoded on computer storage medium for execution by, or to control the operation of, data processing apparatus. Alternatively or in addition, the program instructions can be encoded on an artificially generated propagated signal, e.g., a machine-generated electrical, optical, or electromagnetic signal that is generated to encode information for transmission to suitable receiver apparatus for execution by a data processing apparatus. A computer storage medium can be, or be included in, a computer-readable storage device, a computer-readable storage substrate, a random or serial access memory array or device, or a combination of one or more of them. Moreover, while a computer storage medium is not a propagated signal per se, a computer storage medium can be a source or destination of computer program instructions encoded in an artificially generated propagated signal. The computer storage medium can also be, or be included in, one or more separate physical components or media (e.g., multiple CDs, disks, or other storage devices), including a distributed software environment or cloud computing environment.
- Networks, including core and access networks, including wireless access networks, can include one or more network elements. “Network elements” can encompass various types of routers, switches, gateways, bridges, load balancers, firewalls, servers, inline service nodes, proxies, processors, modules, or any other suitable device, component, element, or object operable to exchange information in a network environment. A network element may include appropriate processors, memory elements, hardware and/or software to support (or otherwise execute) the activities associated with using a processor for screen management functionalities, as outlined herein. Moreover, the network element may include any suitable components, modules, interfaces, or objects that facilitate the operations thereof. This may be inclusive of appropriate algorithms and communication protocols that allow for the effective exchange of data or information.
- The operations described in this specification can be implemented as operations performed by a data processing apparatus on data stored on one or more computer-readable storage devices or received from other sources. The terms “data processing apparatus,” “processor,” “processing device,” and “computing device” can encompass all kinds of apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, a system on a chip, or multiple ones, or combinations, of the foregoing. The apparatus can include general or special purpose logic circuitry, e.g., a central processing unit (CPU), a blade, an application specific integrated circuit (ASIC), or a field-programmable gate array (FPGA), among other suitable options. While some processors and computing devices have been described and/or illustrated as a single processor, multiple processors may be used according to the particular needs of the associated server. References to a single processor are meant to include multiple processors where applicable. Generally, the processor executes instructions and manipulates data to perform certain operations. An apparatus can also include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, a cross-platform runtime environment, a virtual machine, or a combination of one or more of them. The apparatus and execution environment can realize various different computing model infrastructures, such as web services, distributed computing and grid computing infrastructures.
- A computer program (also known as a program, software, software application, script, module, (software) tools, (software) engines, or code) can be written in any form of programming language, including compiled or interpreted languages, declarative or procedural languages, and it can be deployed in any form, including as a standalone program or as a module, component, subroutine, object, or other unit suitable for use in a computing environment. For instance, a computer program may include computer-readable instructions, firmware, wired or programmed hardware, or any combination thereof on a tangible medium operable when executed to perform at least the processes and operations described herein. A computer program may, but need not, correspond to a file in a file system. A program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub programs, or portions of code). A computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
- Programs can be implemented as individual modules that implement the various features and functionality through various objects, methods, or other processes, or may instead include a number of sub-modules, third party services, components, libraries, and such, as appropriate. Conversely, the features and functionality of various components can be combined into single components as appropriate. In certain cases, programs and software systems may be implemented as a composite hosted application. For example, portions of the composite application may be implemented as Enterprise Java Beans (EJBs) or design-time components may have the ability to generate run-time implementations into different platforms, such as J2EE (
Java 2 Platform, Enterprise Edition), ABAP (Advanced Business Application Programming) objects, or Microsoft's .NET, among others. Additionally, applications may represent web-based applications accessed and executed via a network (e.g., through the Internet). Further, one or more processes associated with a particular hosted application or service may be stored, referenced, or executed remotely. For example, a portion of a particular hosted application or service may be a web service associated with the application that is remotely called, while another portion of the hosted application may be an interface object or agent bundled for processing at a remote client. Moreover, any or all of the hosted applications and software service may be a child or sub-module of another software module or enterprise application (not illustrated) without departing from the scope of this disclosure. Still further, portions of a hosted application can be executed by a user working directly at a server hosting the application, as well as remotely at a client. - The processes and logic flows described in this specification can be performed by one or more programmable processors executing one or more computer programs to perform actions by operating on input data and generating output. The processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit).
- Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read only memory or a random access memory or both. The essential elements of a computer are a processor for performing actions in accordance with instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto optical disks, or optical disks. However, a computer need not have such devices. Moreover, a computer can be embedded in another device, e.g., a mobile telephone, a personal digital assistant (PDA), tablet computer, a mobile audio or video player, a game console, a Global Positioning System (GPS) receiver, or a portable storage device (e.g., a universal serial bus (USB) flash drive), to name just a few. Devices suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto optical disks; and CD ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
- To provide for interaction with a user, embodiments of the subject matter described in this specification can be implemented on a computer having a display device, e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor, for displaying information to the user and a keyboard and a pointing device, e.g., a mouse or a trackball, by which the user can provide input to the computer. Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input. In addition, a computer can interact with a user by sending documents to and receiving documents from a device, including remote devices, which are used by the user.
- Embodiments of the subject matter described in this specification can be implemented in a computing system that includes a back end component, e.g., as a data server, or that includes a middleware component, e.g., an application server, or that includes a front end component, e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the subject matter described in this specification, or any combination of one or more such back end, middleware, or front end components. The components of the system can be interconnected by any form or medium of digital data communication, e.g., a communication network. Examples of communication networks include any internal or external network, networks, sub-network, or combination thereof operable to facilitate communications between various computing components in a system. A network may communicate, for example, Internet Protocol (IP) packets, Frame Relay frames, Asynchronous Transfer Mode (ATM) cells, voice, video, data, and other suitable information between network addresses. The network may also include one or more local area networks (LANs), radio access networks (RANs), metropolitan area networks (MANs), wide area networks (WANs), all or a portion of the Internet, peer-to-peer networks (e.g., ad hoc peer-to-peer networks), and/or any other communication system or systems at one or more locations.
- The computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. In some embodiments, a server transmits data (e.g., an HTML page) to a client device (e.g., for purposes of displaying data to and receiving user input from a user interacting with the client device). Data generated at the client device (e.g., a result of the user interaction) can be received from the client device at the server.
- While this specification contains many specific implementation details, these should not be construed as limitations on the scope of any inventions or of what may be claimed, but rather as descriptions of features specific to particular embodiments of particular inventions. Certain features that are described in this specification in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.
- The following examples pertain to embodiments in accordance with this Specification. One or more embodiments may provide an apparatus, a system, a machine readable medium, and a method to analyze code of a particular application against a semantic model of a software development kit of a particular platform, identify, based on the analysis of the code, a set of behaviors of the particular application, and identify that one or more of the set of behaviors are undesired behaviors. The semantic model can associate potential application behaviors with one or more APIs of the particular platform.
- In one example, identifying that one or more of the set of behaviors are undesired behaviors includes determining that the one or more behaviors violate one or more rules. The rules can be associated with a particular user.
- In one example, a user input identifies one or more of the set of behaviors as undesirable. The user input can be received in connection with a user interface displaying human readable descriptions of the identified set of behaviors.
- In one example, code of the particular application can be disassembled into a control flow and a model of application logic for the particular application can be generated based at least in part on the semantic model. The model of application logic can be further based, at least in part, on ambient application knowledge.
- In one example, a remediation action can be performed based on the identification that one or more of the set of behaviors are undesired behaviors.
- In one example, the code of the particular application is analyzed in connection with an attempt to implement the particular application on a particular user device.
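The analysis embodiments above can be sketched briefly in Python. This is a hedged illustration only: the semantic-model contents, API names, behavior labels, and the rule set are invented for the example and are not taken from the patent.

```python
# A toy "semantic model" associating platform APIs with potential behaviors,
# as in the embodiments above. All API names and labels are illustrative.
SEMANTIC_MODEL = {
    "ContactsContract.query": "reads_contacts",
    "LocationManager.getLastKnownLocation": "reads_location",
    "HttpURLConnection.connect": "network_access",
}

def identify_behaviors(api_calls, model=SEMANTIC_MODEL):
    """Map the platform APIs invoked by an application to the behaviors
    the semantic model associates with them."""
    return {model[call] for call in api_calls if call in model}

def undesired(behaviors, rules):
    """Flag behaviors that violate a user's rules (here, a set of
    behaviors the user has banned)."""
    return behaviors & rules

calls = ["ContactsContract.query", "HttpURLConnection.connect"]
found = identify_behaviors(calls)            # behaviors implied by the calls
flagged = undesired(found, {"reads_contacts"})
```

A real analysis would derive the API calls by disassembling the application into a control flow, per the example above; here they are simply given as a list.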
- One or more embodiments may provide an apparatus, a system, a machine readable medium, and a method to identify a particular behavior in a set of behaviors detected as included in a particular application, identify a section of code of the particular application corresponding to the particular behavior, and perform a remediation action on the section of code to remediate the particular behavior and generate a healed version of the particular application.
- In one example, the remediation action preserves other behaviors of the particular application other than the particular behavior.
- In one example, the remediation action includes deleting the section of code.
- In one example, the remediation action includes rewriting the section of code.
- In one example, the remediation action includes adding additional code to the application to nullify the particular behavior.
- In one example, the remediation action is identified from a policy identifying a remediation pattern determined to be applicable to remedying the particular behavior.
- In one example, the remediation action includes inserting application logic allowing a user to selectively enable a healed version of the particular behavior at launch of the healed application on a user device. The user can be further allowed to selectively enable an unhealed version of the particular behavior in lieu of the healed version.
- In one example, the set of behaviors of the particular application can be detected through an analysis of code of the particular application.
- In one example, the remediation action is triggered by a user request.
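The delete/rewrite/nullify remediation patterns in the examples above might be sketched as follows. This is an assumption-laden illustration, not the patent's implementation: the function names, the pattern identifiers, and the representation of an application as named code sections are all invented here.

```python
def remediate(section_code, pattern, replacement=""):
    """Apply one remediation pattern to a section of code (illustrative)."""
    if pattern == "delete":
        return ""                      # strip the section outright
    if pattern == "rewrite":
        return replacement             # substitute safe code for the section
    if pattern == "nullify":
        # Retain the section but guard it so the behavior never executes.
        return "if False:\n    " + section_code.replace("\n", "\n    ")
    raise ValueError(f"unknown remediation pattern: {pattern}")

def heal(sections, behavior_to_section, undesired, pattern="delete"):
    """Return a healed copy of an application's code sections, remediating
    the undesired behaviors while preserving all other sections."""
    healed = dict(sections)
    for behavior in undesired:
        sec = behavior_to_section.get(behavior)
        if sec in healed:
            healed[sec] = remediate(healed[sec], pattern)
    return healed

sections = {"s1": "upload(contacts)", "s2": "draw_ui()"}
healed = heal(sections, {"exfiltrate_contacts": "s1"},
              {"exfiltrate_contacts"}, pattern="nullify")
# healed["s1"] is guarded so it never runs; healed["s2"] is untouched
```

The `pattern` argument stands in for the policy-identified remediation pattern from the example above; selecting it from an actual policy store is omitted.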
- One or more embodiments may provide an apparatus, a system, a machine readable medium, and a method to activate a particular one of a plurality of modes defined for a particular user device, and restrict access to one or more applications installed on the particular user device in accordance with the activated particular mode. The restricted applications can be accessible when another one of the plurality of modes is activated.
- In one example, the particular mode is activated in response to a particular passcode entered by a user of the particular user device, where each of the plurality of modes is associated with a corresponding passcode. Activation of the particular mode can include identifying the particular mode from the plurality of modes based on the entry of the particular passcode, and authenticating access to the particular mode based on the entry of the particular passcode.
- In one example, one or more of the plurality of modes are user-defined modes.
- In one example, an alternate device configuration can be applied to the particular user device based on activation of the particular mode. The alternate device configuration can restrict access to one or more subsystems of the particular user device.
- In one example, one of the plurality of modes is an administrative mode allowing for modification of the plurality of modes.
- In one example, at least one of the plurality of modes is an instance of a mode downloadable from a mode sharing service remote from the particular user device.
- In one example, the particular mode is activated automatically based at least in part on the detection of a particular context using functionality of the particular user device.
- In one example, at least a particular one of the applications is restricted based on a defined rule for the particular mode.
- In one example, the defined rule pertains to detected behavior of the particular application.
- In one example, the plurality of modes includes a mode designated as a quarantine mode for applications awaiting behavioral analysis or remediation.
- In one example, the particular mode is activated in response to a user command received at a device remote from the particular user device.
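A minimal sketch of the passcode-driven mode examples above: entering a passcode both identifies and authenticates a mode, and the active mode restricts which installed applications are accessible. The hashing choice, class and method names, mode names, and application identifiers are all assumptions made for this illustration.

```python
import hashlib

def _digest(passcode):
    # Store only a digest of each passcode, never the passcode itself.
    return hashlib.sha256(passcode.encode()).hexdigest()

class ModedDevice:
    def __init__(self):
        self._modes = {}       # mode name -> set of accessible app ids
        self._by_digest = {}   # passcode digest -> mode name
        self.active = None

    def define_mode(self, name, passcode, allowed_apps):
        self._modes[name] = set(allowed_apps)
        self._by_digest[_digest(passcode)] = name

    def enter_passcode(self, passcode):
        # The passcode both selects the mode and authenticates access to it,
        # as in the example above; returns None when no mode matches.
        mode = self._by_digest.get(_digest(passcode))
        if mode is not None:
            self.active = mode
        return mode

    def is_accessible(self, app_id):
        # Deny by default when no mode has been activated.
        return self.active is not None and app_id in self._modes[self.active]

device = ModedDevice()
device.define_mode("parent", "2468", {"mail", "banking", "games"})
device.define_mode("kids", "1111", {"games"})
device.enter_passcode("1111")
device.is_accessible("games")    # True
device.is_accessible("banking")  # False
```

Restricting device subsystems, automatic context-based activation, and remote activation from the other examples would hang off the same mode state but are omitted here.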
- Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.
- Thus, particular embodiments of the subject matter have been described. Other embodiments are within the scope of the following claims. In some cases, the actions recited in the claims can be performed in a different order and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results.
Claims (24)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/652,623 US10114950B2 (en) | 2012-10-19 | 2017-07-18 | Mobile application management |
US16/138,904 US11157616B2 (en) | 2012-10-19 | 2018-09-21 | Mobile application management |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
IN1215/KOL/2012 | 2012-10-19 | ||
IN1215KO2012 | 2012-10-19 | ||
PCT/US2013/065799 WO2014063124A1 (en) | 2012-10-19 | 2013-10-18 | Mobile application management |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2013/065799 A-371-Of-International WO2014063124A1 (en) | 2012-10-19 | 2013-10-18 | Mobile application management |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/652,623 Continuation US10114950B2 (en) | 2012-10-19 | 2017-07-18 | Mobile application management |
Publications (1)
Publication Number | Publication Date |
---|---|
US20150220734A1 true US20150220734A1 (en) | 2015-08-06 |
Family
ID=50488809
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/126,866 Abandoned US20150220734A1 (en) | 2012-10-19 | 2013-10-18 | Mobile application management |
US15/652,623 Active US10114950B2 (en) | 2012-10-19 | 2017-07-18 | Mobile application management |
US16/138,904 Active US11157616B2 (en) | 2012-10-19 | 2018-09-21 | Mobile application management |
Family Applications After (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/652,623 Active US10114950B2 (en) | 2012-10-19 | 2017-07-18 | Mobile application management |
US16/138,904 Active US11157616B2 (en) | 2012-10-19 | 2018-09-21 | Mobile application management |
Country Status (5)
Country | Link |
---|---|
US (3) | US20150220734A1 (en) |
EP (1) | EP2909775B1 (en) |
JP (1) | JP6013613B2 (en) |
CN (1) | CN104662547A (en) |
WO (1) | WO2014063124A1 (en) |
US20060015940A1 (en) * | 2004-07-14 | 2006-01-19 | Shay Zamir | Method for detecting unwanted executables |
ATE393424T1 (en) * | 2004-10-06 | 2008-05-15 | Nokia Corp | ACCESS CONTROL FOR COMMUNICATIONS TERMINAL |
US20090328185A1 (en) * | 2004-11-04 | 2009-12-31 | Eric Van Den Berg | Detecting exploit code in network flows |
US7549144B2 (en) | 2005-02-22 | 2009-06-16 | Microsoft Corporation | Custom API modeling for source code static analysis simulator |
US7818800B1 (en) | 2005-08-05 | 2010-10-19 | Symantec Corporation | Method, system, and computer program product for blocking malicious program behaviors |
CN101243400B (en) * | 2005-08-16 | 2015-03-25 | Emc公司 | Information protection method and system |
US8413209B2 (en) * | 2006-03-27 | 2013-04-02 | Telecom Italia S.P.A. | System for enforcing security policies on mobile communications devices |
US20080083031A1 (en) * | 2006-12-20 | 2008-04-03 | Microsoft Corporation | Secure service computation |
US8407675B1 (en) | 2007-02-06 | 2013-03-26 | The United States Of America As Represented By The Secretary Of The Navy | Extraction of executable code and translation to alternate platform |
CN100461132C (en) * | 2007-03-02 | 2009-02-11 | 北京邮电大学 | Software safety code analyzer based on static analysis of source code and testing method therefor |
US7825820B2 (en) * | 2007-09-28 | 2010-11-02 | Apple Inc. | Security using electronic devices |
JP2009130856A (en) * | 2007-11-27 | 2009-06-11 | Nec Corp | Mobile terminal, application execution method, computer program, and system |
CN101266550B (en) * | 2007-12-21 | 2011-02-16 | 北京大学 | Malicious code detection method |
JP5577527B2 (en) * | 2008-06-25 | 2014-08-27 | 日本電気株式会社 | Information processing system, personal information device, and access management method |
US8347386B2 (en) * | 2008-10-21 | 2013-01-01 | Lookout, Inc. | System and method for server-coupled malware prevention |
JP5440973B2 (en) * | 2009-02-23 | 2014-03-12 | 独立行政法人情報通信研究機構 | Computer inspection system and computer inspection method |
JP5302711B2 (en) * | 2009-02-25 | 2013-10-02 | 日本電信電話株式会社 | Client terminal, network connection control method, terminal management processor, program |
JP2010257150A (en) * | 2009-04-23 | 2010-11-11 | Ntt Docomo Inc | Device and method for detection of fraudulence processing, and program |
CN102404510B (en) * | 2009-06-16 | 2015-07-01 | 英特尔公司 | Camera applications in handheld device |
US20100328032A1 (en) * | 2009-06-24 | 2010-12-30 | Broadcom Corporation | Security for computing unit with femtocell ap functionality |
KR101632203B1 (en) * | 2010-03-17 | 2016-06-22 | 삼성전자주식회사 | Method and apparatus for executing application of mobile terminal |
WO2012089898A1 (en) * | 2010-12-27 | 2012-07-05 | Nokia Corporation | Method and apparatus for providing input suggestions |
US9003544B2 (en) | 2011-07-26 | 2015-04-07 | Kaspersky Lab Zao | Efficient securing of data on mobile devices |
US9063964B2 (en) | 2012-01-04 | 2015-06-23 | Trustgo Mobile, Inc. | Detecting application harmful behavior and grading application risks for mobile devices |
US9369589B2 (en) * | 2012-01-27 | 2016-06-14 | Microsoft Technology Licensing, Llc | Updating dynamic data usage plans and statistics |
US9325806B2 (en) * | 2012-02-24 | 2016-04-26 | Qualcomm Incorporated | Cooperative loading of webpages based on shared meta information |
US9202047B2 (en) * | 2012-05-14 | 2015-12-01 | Qualcomm Incorporated | System, apparatus, and method for adaptive observation of mobile device behavior |
US9230076B2 (en) * | 2012-08-30 | 2016-01-05 | Microsoft Technology Licensing, Llc | Mobile device child share |
US8819855B2 (en) * | 2012-09-10 | 2014-08-26 | Mdi Security, Llc | System and method for deploying handheld devices to secure an area |
WO2014063124A1 (en) | 2012-10-19 | 2014-04-24 | Mcafee, Inc. | Mobile application management |
US9298361B2 (en) * | 2013-03-15 | 2016-03-29 | Apple Inc. | Analyzing applications for different access modes |
2013
- 2013-10-18 WO PCT/US2013/065799 patent/WO2014063124A1/en active Application Filing
- 2013-10-18 EP EP13848027.2A patent/EP2909775B1/en active Active
- 2013-10-18 JP JP2015534833A patent/JP6013613B2/en active Active
- 2013-10-18 US US14/126,866 patent/US20150220734A1/en not_active Abandoned
- 2013-10-18 CN CN201380048869.7A patent/CN104662547A/en active Pending

2017
- 2017-07-18 US US15/652,623 patent/US10114950B2/en active Active

2018
- 2018-09-21 US US16/138,904 patent/US11157616B2/en active Active
Cited By (129)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9589129B2 (en) | 2012-06-05 | 2017-03-07 | Lookout, Inc. | Determining source of side-loaded software |
US10419222B2 (en) * | 2012-06-05 | 2019-09-17 | Lookout, Inc. | Monitoring for fraudulent or harmful behavior in applications being installed on user devices |
US9940454B2 (en) | 2012-06-05 | 2018-04-10 | Lookout, Inc. | Determining source of side-loaded software using signature of authorship |
US20150169877A1 (en) * | 2012-06-05 | 2015-06-18 | Lookout, Inc. | Monitoring for fraudulent or harmful behavior in applications being installed on user devices |
US11336458B2 (en) * | 2012-06-05 | 2022-05-17 | Lookout, Inc. | Evaluating authenticity of applications based on assessing user device context for increased security |
US9407443B2 (en) | 2012-06-05 | 2016-08-02 | Lookout, Inc. | Component analysis of software applications on computing devices |
US10256979B2 (en) | 2012-06-05 | 2019-04-09 | Lookout, Inc. | Assessing application authenticity and performing an action in response to an evaluation result |
US9992025B2 (en) | 2012-06-05 | 2018-06-05 | Lookout, Inc. | Monitoring installed applications on user devices |
US10114950B2 (en) | 2012-10-19 | 2018-10-30 | McAFEE, LLC. | Mobile application management |
US11157616B2 (en) * | 2012-10-19 | 2021-10-26 | Mcafee, Llc | Mobile application management |
US10848521B1 (en) * | 2013-03-13 | 2020-11-24 | Fireeye, Inc. | Malicious content analysis using simulated user interaction without user involvement |
US9912698B1 (en) * | 2013-03-13 | 2018-03-06 | Fireeye, Inc. | Malicious content analysis using simulated user interaction without user involvement |
US20150212826A1 (en) * | 2014-01-28 | 2015-07-30 | Nec Corporation | Information processing apparatus, information processing method, and storage medium |
US9858085B2 (en) * | 2014-01-28 | 2018-01-02 | Nec Corporation | Information processing including BIOS apparatus, information processing method thereof, and storage medium |
US9846772B1 (en) * | 2014-05-07 | 2017-12-19 | Symantec Corporation | Systems and methods for detecting misplaced applications using functional categories |
US9245123B1 (en) | 2014-05-07 | 2016-01-26 | Symantec Corporation | Systems and methods for identifying malicious files |
US9571509B1 (en) | 2014-05-07 | 2017-02-14 | Symantec Corporation | Systems and methods for identifying variants of samples based on similarity analysis |
US9652615B1 (en) | 2014-06-25 | 2017-05-16 | Symantec Corporation | Systems and methods for analyzing suspected malware |
US10970392B2 (en) * | 2014-06-26 | 2021-04-06 | Palo Alto Networks, Inc. | Grouping application components for classification and malware detection |
US10019569B2 (en) * | 2014-06-27 | 2018-07-10 | Qualcomm Incorporated | Dynamic patching for diversity-based software security |
US9967278B2 (en) * | 2014-10-21 | 2018-05-08 | Proofpoint, Inc. | Systems and methods for application security analysis |
US10623435B2 (en) | 2014-10-21 | 2020-04-14 | Proofpoint, Inc. | Application security analysis |
US11336678B2 (en) | 2014-10-21 | 2022-05-17 | Proofpoint, Inc. | Methods and systems for security analysis of applications on mobile devices brought into an enterprise network environment |
US10097576B2 (en) * | 2014-10-21 | 2018-10-09 | Proofpoint, Inc. | Systems and methods for application security analysis |
US20160112451A1 (en) * | 2014-10-21 | 2016-04-21 | Proofpoint, Inc. | Systems and methods for application security analysis |
US10505933B2 (en) * | 2014-10-31 | 2019-12-10 | Proofpoint, Inc. | Systems and methods for security analysis of applications on user mobile devices while maintaining user application privacy |
US11032711B2 (en) | 2014-10-31 | 2021-06-08 | Proofpoint, Inc. | Systems and methods for security analysis of applications on user mobile devices while maintaining user application privacy |
US10270769B2 (en) | 2014-10-31 | 2019-04-23 | Proofpoint, Inc. | Privately performing application security analysis |
US20190182247A1 (en) * | 2014-10-31 | 2019-06-13 | Proofpoint, Inc. | Systems and Methods for Security Analysis of Applications on User Mobile Devices While Maintaining User Application Privacy |
US11540133B2 (en) | 2014-10-31 | 2022-12-27 | Proofpoint, Inc. | Systems and methods for security analysis of applications on user mobile devices while maintaining user application privacy |
US20160191645A1 (en) * | 2014-12-30 | 2016-06-30 | Citrix Systems, Inc. | Containerizing Web Applications for Managed Execution |
US20160232353A1 (en) * | 2015-02-09 | 2016-08-11 | Qualcomm Incorporated | Determining Model Protection Level On-Device based on Malware Detection in Similar Devices |
US20160299977A1 (en) * | 2015-04-13 | 2016-10-13 | Quixey, Inc. | Action-Based App Recommendation Engine |
US11259183B2 (en) | 2015-05-01 | 2022-02-22 | Lookout, Inc. | Determining a security state designation for a computing device based on a source of software |
US12120519B2 (en) | 2015-05-01 | 2024-10-15 | Lookout, Inc. | Determining a security state based on communication with an authenticity server |
US20170053032A1 (en) * | 2015-08-17 | 2017-02-23 | Accenture Global Solutions Limited | Recommendation engine for aggregated platform data |
US10599679B2 (en) | 2015-08-17 | 2020-03-24 | Accenture Global Solutions Limited | Platform data aggregation and semantic modeling |
US10776398B2 (en) | 2015-08-17 | 2020-09-15 | Accenture Global Solutions Limited | Platform data lifecycle management |
US10509806B2 (en) * | 2015-08-17 | 2019-12-17 | Accenture Global Solutions Limited | Recommendation engine for aggregated platform data |
US10021543B2 (en) | 2015-09-18 | 2018-07-10 | Xiaomi Inc. | Short message service reading method and device |
EP3145151A1 (en) * | 2015-09-18 | 2017-03-22 | Xiaomi Inc. | Short message service reading method and device |
US10027629B2 (en) | 2015-09-18 | 2018-07-17 | Xiaomi Inc. | Short message service reading method and device |
US9998887B2 (en) | 2015-09-18 | 2018-06-12 | Xiaomi Inc. | Short message service reading method and device |
US10373140B1 (en) | 2015-10-26 | 2019-08-06 | Intuit Inc. | Method and system for detecting fraudulent bill payment transactions using dynamic multi-parameter predictive modeling |
US20170142156A1 (en) * | 2015-11-12 | 2017-05-18 | Toyota Infotechnology Center Usa, Inc. | Application Assurance for Open Platform In-Vehicle Infotainment System |
US10754944B2 (en) | 2016-01-27 | 2020-08-25 | Yuta TAKEDA | Processing system, and processing method and program |
US11936666B1 (en) | 2016-03-31 | 2024-03-19 | Musarubra Us Llc | Risk analyzer for ascertaining a risk of harm to a network and generating alerts regarding the ascertained risk |
US11979428B1 (en) | 2016-03-31 | 2024-05-07 | Musarubra Us Llc | Technique for verifying exploit/malware at malware detection appliance through correlation with endpoints |
US10893059B1 (en) | 2016-03-31 | 2021-01-12 | Fireeye, Inc. | Verification and enhancement using detection systems located at the network periphery and endpoint devices |
US10826933B1 (en) * | 2016-03-31 | 2020-11-03 | Fireeye, Inc. | Technique for verifying exploit/malware at malware detection appliance through correlation with endpoints |
WO2017205802A1 (en) * | 2016-05-27 | 2017-11-30 | App Annie Inc. | Advertisement data metric determination within mobile applications |
US10083452B1 (en) | 2016-06-21 | 2018-09-25 | Intuit Inc. | Method and system for identifying potentially fraudulent bill and invoice payments |
WO2018022702A1 (en) * | 2016-07-27 | 2018-02-01 | Intuit Inc. | Method and system for identifying and addressing potential account takeover activity in a financial system |
US20180039774A1 (en) * | 2016-08-08 | 2018-02-08 | International Business Machines Corporation | Install-Time Security Analysis of Mobile Applications |
US10621333B2 (en) * | 2016-08-08 | 2020-04-14 | International Business Machines Corporation | Install-time security analysis of mobile applications |
US11087334B1 (en) | 2017-04-04 | 2021-08-10 | Intuit Inc. | Method and system for identifying potential fraud activity in a tax return preparation system, at least partially based on data entry characteristics of tax return content |
US10129269B1 (en) | 2017-05-15 | 2018-11-13 | Forcepoint, LLC | Managing blockchain access to user profile information |
US11025646B2 (en) | 2017-05-15 | 2021-06-01 | Forcepoint, LLC | Risk adaptive protection |
US9882918B1 (en) * | 2017-05-15 | 2018-01-30 | Forcepoint, LLC | User behavior profile in a blockchain |
US10063568B1 (en) | 2017-05-15 | 2018-08-28 | Forcepoint Llc | User behavior profile in a blockchain |
US10645096B2 (en) | 2017-05-15 | 2020-05-05 | Forcepoint Llc | User behavior profile environment |
US10171488B2 (en) | 2017-05-15 | 2019-01-01 | Forcepoint, LLC | User behavior profile |
US11757902B2 (en) | 2017-05-15 | 2023-09-12 | Forcepoint Llc | Adaptive trust profile reference architecture |
US11575685B2 (en) | 2017-05-15 | 2023-02-07 | Forcepoint Llc | User behavior profile including temporal detail corresponding to user interaction |
US10798109B2 (en) | 2017-05-15 | 2020-10-06 | Forcepoint Llc | Adaptive trust profile reference architecture |
US10542013B2 (en) | 2017-05-15 | 2020-01-21 | Forcepoint Llc | User behavior profile in a blockchain |
US10834097B2 (en) | 2017-05-15 | 2020-11-10 | Forcepoint, LLC | Adaptive trust profile components |
US10834098B2 (en) | 2017-05-15 | 2020-11-10 | Forcepoint, LLC | Using a story when generating inferences using an adaptive trust profile |
US10623431B2 (en) | 2017-05-15 | 2020-04-14 | Forcepoint Llc | Discerning psychological state from correlated user behavior and contextual information |
US10855693B2 (en) | 2017-05-15 | 2020-12-01 | Forcepoint, LLC | Using an adaptive trust profile to generate inferences |
US10855692B2 (en) | 2017-05-15 | 2020-12-01 | Forcepoint, LLC | Adaptive trust profile endpoint |
US11463453B2 (en) | 2017-05-15 | 2022-10-04 | Forcepoint, LLC | Using a story when generating inferences using an adaptive trust profile |
US10862901B2 (en) | 2017-05-15 | 2020-12-08 | Forcepoint, LLC | User behavior profile including temporal detail corresponding to user interaction |
US10862927B2 (en) | 2017-05-15 | 2020-12-08 | Forcepoint, LLC | Dividing events into sessions during adaptive trust profile operations |
US10264012B2 (en) | 2017-05-15 | 2019-04-16 | Forcepoint, LLC | User behavior profile |
US10298609B2 (en) | 2017-05-15 | 2019-05-21 | Forcepoint, LLC | User behavior profile environment |
US10530786B2 (en) | 2017-05-15 | 2020-01-07 | Forcepoint Llc | Managing access to user profile information via a distributed transaction database |
US10915644B2 (en) | 2017-05-15 | 2021-02-09 | Forcepoint, LLC | Collecting data for centralized use in an adaptive trust profile event via an endpoint |
US10917423B2 (en) | 2017-05-15 | 2021-02-09 | Forcepoint, LLC | Intelligently differentiating between different types of states and attributes when using an adaptive trust profile |
US10915643B2 (en) | 2017-05-15 | 2021-02-09 | Forcepoint, LLC | Adaptive trust profile endpoint architecture |
US10943019B2 (en) | 2017-05-15 | 2021-03-09 | Forcepoint, LLC | Adaptive trust profile endpoint |
US10944762B2 (en) | 2017-05-15 | 2021-03-09 | Forcepoint, LLC | Managing blockchain access to user information |
US10326775B2 (en) | 2017-05-15 | 2019-06-18 | Forcepoint, LLC | Multi-factor authentication using a user behavior profile as a factor |
US10326776B2 (en) | 2017-05-15 | 2019-06-18 | Forcepoint, LLC | User behavior profile including temporal detail corresponding to user interaction |
US11082440B2 (en) | 2017-05-15 | 2021-08-03 | Forcepoint Llc | User profile definition and management |
US10447718B2 (en) | 2017-05-15 | 2019-10-15 | Forcepoint Llc | User profile definition and management |
US10999296B2 (en) | 2017-05-15 | 2021-05-04 | Forcepoint, LLC | Generating adaptive trust profiles using information derived from similarly situated organizations |
US10999297B2 (en) | 2017-05-15 | 2021-05-04 | Forcepoint, LLC | Using expected behavior of an entity when prepopulating an adaptive trust profile |
US10218697B2 (en) | 2017-06-09 | 2019-02-26 | Lookout, Inc. | Use of device risk evaluation to manage access to services |
US11038876B2 (en) | 2017-06-09 | 2021-06-15 | Lookout, Inc. | Managing access to services based on fingerprint matching |
US12081540B2 (en) | 2017-06-09 | 2024-09-03 | Lookout, Inc. | Configuring access to a network service based on a security state of a mobile device |
US10733323B2 (en) | 2017-07-26 | 2020-08-04 | Forcepoint Llc | Privacy protection during insider threat monitoring |
US10262153B2 (en) | 2017-07-26 | 2019-04-16 | Forcepoint, LLC | Privacy protection during insider threat monitoring |
US11829866B1 (en) | 2017-12-27 | 2023-11-28 | Intuit Inc. | System and method for hierarchical deep semi-supervised embeddings for dynamic targeted anomaly detection |
US10607021B2 (en) * | 2018-01-26 | 2020-03-31 | Bank Of America Corporation | Monitoring usage of an application to identify characteristics and trigger security control |
US11151272B2 (en) | 2018-01-26 | 2021-10-19 | Bank Of America Corporation | Monitoring usage of an application to identify characteristics and trigger security control |
US11681809B2 (en) * | 2018-04-19 | 2023-06-20 | Canon Kabushiki Kaisha | Information processing apparatus, control method, and storage medium |
US11233779B2 (en) * | 2018-06-03 | 2022-01-25 | Apple Inc. | Wireless credential sharing |
US11048390B2 (en) * | 2018-06-25 | 2021-06-29 | MI Technical Solutions, Inc. | Auto-reformatting of home screen graphical user interface depicting only administrator-approved applications |
US11474978B2 (en) | 2018-07-06 | 2022-10-18 | Capital One Services, Llc | Systems and methods for a data search engine based on data profiles |
US11580261B2 (en) * | 2018-07-06 | 2023-02-14 | Capital One Services, Llc | Automated honeypot creation within a network |
US11237884B2 (en) * | 2018-07-06 | 2022-02-01 | Capital One Services, Llc | Automated honeypot creation within a network |
US10884894B2 (en) | 2018-07-06 | 2021-01-05 | Capital One Services, Llc | Systems and methods for synthetic data generation for time-series data using data segments |
US10983841B2 (en) | 2018-07-06 | 2021-04-20 | Capital One Services, Llc | Systems and methods for removing identifiable information |
US20220107851A1 (en) * | 2018-07-06 | 2022-04-07 | Capital One Services, Llc | Automated honeypot creation within a network |
US10860460B2 (en) * | 2018-07-06 | 2020-12-08 | Capital One Services, Llc | Automated honeypot creation within a network |
US12093753B2 (en) | 2018-07-06 | 2024-09-17 | Capital One Services, Llc | Method and system for synthetic generation of time series data |
US11385942B2 (en) | 2018-07-06 | 2022-07-12 | Capital One Services, Llc | Systems and methods for censoring text inline |
US10970137B2 (en) | 2018-07-06 | 2021-04-06 | Capital One Services, Llc | Systems and methods to identify breaking application program interface changes |
US10599957B2 (en) | 2018-07-06 | 2020-03-24 | Capital One Services, Llc | Systems and methods for detecting data drift for data used in machine learning models |
US11513869B2 (en) | 2018-07-06 | 2022-11-29 | Capital One Services, Llc | Systems and methods for synthetic database query generation |
US20200014722A1 (en) * | 2018-07-06 | 2020-01-09 | Capital One Services, Llc | Automated honeypot creation within a network |
US10592386B2 (en) | 2018-07-06 | 2020-03-17 | Capital One Services, Llc | Fully automated machine learning system which generates and optimizes solutions given a dataset and a desired outcome |
US11574077B2 (en) | 2018-07-06 | 2023-02-07 | Capital One Services, Llc | Systems and methods for removing identifiable information |
US11210145B2 (en) | 2018-07-06 | 2021-12-28 | Capital One Services, Llc | Systems and methods to manage application program interface communications |
US11615208B2 (en) | 2018-07-06 | 2023-03-28 | Capital One Services, Llc | Systems and methods for synthetic data generation |
US11126475B2 (en) | 2018-07-06 | 2021-09-21 | Capital One Services, Llc | Systems and methods to use neural networks to transform a model into a neural network model |
US11687384B2 (en) | 2018-07-06 | 2023-06-27 | Capital One Services, Llc | Real-time synthetically generated video from still frames |
US11704169B2 (en) | 2018-07-06 | 2023-07-18 | Capital One Services, Llc | Data model generation using generative adversarial networks |
US11822975B2 (en) | 2018-07-06 | 2023-11-21 | Capital One Services, Llc | Systems and methods for synthetic data generation for time-series data using data segments |
US10599550B2 (en) | 2018-07-06 | 2020-03-24 | Capital One Services, Llc | Systems and methods to identify breaking application program interface changes |
US10489224B1 (en) | 2018-07-30 | 2019-11-26 | International Business Machines Corporation | Managing application programming interface requests |
US11741196B2 (en) | 2018-11-15 | 2023-08-29 | The Research Foundation For The State University Of New York | Detecting and preventing exploits of software vulnerability using instruction tags |
US12061677B2 (en) | 2018-11-15 | 2024-08-13 | The Research Foundation For The State University Of New York | Secure processor for detecting and preventing exploits of software vulnerability |
US11163884B2 (en) | 2019-04-26 | 2021-11-02 | Forcepoint Llc | Privacy and the adaptive trust profile |
US10997295B2 (en) | 2019-04-26 | 2021-05-04 | Forcepoint, LLC | Adaptive trust profile reference architecture |
US10853496B2 (en) | 2019-04-26 | 2020-12-01 | Forcepoint, LLC | Adaptive trust profile behavioral fingerprint |
US20220084327A1 (en) * | 2019-06-19 | 2022-03-17 | Autel Intelligent Technology Corp., Ltd. | Automobile diagnosis method, apparatus and system |
US11194559B2 (en) * | 2019-08-06 | 2021-12-07 | Saudi Arabian Oil Company | Method and apparatus for platform as a service (PaaS) automation control |
Also Published As
Publication number | Publication date |
---|---|
EP2909775A4 (en) | 2016-06-08 |
US11157616B2 (en) | 2021-10-26 |
JP2015534690A (en) | 2015-12-03 |
WO2014063124A1 (en) | 2014-04-24 |
US10114950B2 (en) | 2018-10-30 |
JP6013613B2 (en) | 2016-10-25 |
CN104662547A (en) | 2015-05-27 |
EP2909775A1 (en) | 2015-08-26 |
US20180089431A1 (en) | 2018-03-29 |
EP2909775B1 (en) | 2022-01-26 |
US20190026464A1 (en) | 2019-01-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11157616B2 (en) | Mobile application management | |
US11405400B2 (en) | Hardening based on access capability exercise sufficiency | |
US11086984B2 (en) | Mobile device policy enforcement | |
AU2019347708B2 (en) | Systems and methods for consistent enforcement policy across different saas applications via embedded browser | |
KR102403480B1 (en) | Device policy manager | |
Agarwal et al. | ProtectMyPrivacy: detecting and mitigating privacy leaks on iOS devices using crowdsourcing | |
US10565378B1 (en) | Exploit of privilege detection framework | |
Miller et al. | iOS Hacker's Handbook | |
Teufl et al. | Malware detection by applying knowledge discovery processes to application metadata on the Android Market (Google Play) | |
US20140223543A1 (en) | Computing device including a port and a guest domain | |
US10609165B1 (en) | Systems and methods for gamification of SaaS applications | |
US10963583B1 (en) | Automatic detection and protection against file system privilege escalation and manipulation vulnerabilities | |
US11973796B2 (en) | Dangling domain detection and access mitigation | |
Mylonas et al. | On the feasibility of malware attacks in smartphone platforms | |
Bhuiyan et al. | API vulnerabilities: Current status and dependencies | |
Gupta et al. | A risk-driven model to minimize the effects of human factors on smart devices | |
CN106209746B (en) | Security service providing method and server | |
Ingale et al. | Security in android based smartphone | |
Jain | Android security: Permission based attacks | |
Nazar et al. | Rooting Android–Extending the ADB by an auto-connecting WiFi-accessible service | |
Hidhaya et al. | Supplementary event-listener injection attack in smart phones | |
Karthick et al. | Static analysis tool for identification of permission misuse by android applications | |
Gorbāns et al. | The Myths of and Solutions for Android OS Controlled and Secure Environment | |
US20230214533A1 (en) | Computer-implemented systems and methods for application identification and authentication | |
Nwobodo | Exploring Optimal Subsets of Statically Registered Broadcast Receivers and Permissions for the Prediction of Malicious Behavior in Android Applications |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE |
|
AS | Assignment |
Owner name: MCAFEE, LLC, CALIFORNIA
Free format text: CHANGE OF NAME AND ENTITY CONVERSION;ASSIGNOR:MCAFEE, INC.;REEL/FRAME:043665/0918
Effective date: 20161220
|
AS | Assignment |
Owner name: MORGAN STANLEY SENIOR FUNDING, INC., MARYLAND
Free format text: SECURITY INTEREST;ASSIGNOR:MCAFEE, LLC;REEL/FRAME:045056/0676
Effective date: 20170929

Owner name: JPMORGAN CHASE BANK, N.A., NEW YORK
Free format text: SECURITY INTEREST;ASSIGNOR:MCAFEE, LLC;REEL/FRAME:045055/0786
Effective date: 20170929
|
AS | Assignment |
Owner name: MORGAN STANLEY SENIOR FUNDING, INC., MARYLAND
Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE REMOVE PATENT 6336186 PREVIOUSLY RECORDED ON REEL 045056 FRAME 0676. ASSIGNOR(S) HEREBY CONFIRMS THE SECURITY INTEREST;ASSIGNOR:MCAFEE, LLC;REEL/FRAME:054206/0593
Effective date: 20170929

Owner name: JPMORGAN CHASE BANK, N.A., NEW YORK
Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE REMOVE PATENT 6336186 PREVIOUSLY RECORDED ON REEL 045055 FRAME 786. ASSIGNOR(S) HEREBY CONFIRMS THE SECURITY INTEREST;ASSIGNOR:MCAFEE, LLC;REEL/FRAME:055854/0047
Effective date: 20170929
|
AS | Assignment |
Owner name: MCAFEE, LLC, CALIFORNIA
Free format text: RELEASE OF INTELLECTUAL PROPERTY COLLATERAL - REEL/FRAME 045055/0786;ASSIGNOR:JPMORGAN CHASE BANK, N.A., AS COLLATERAL AGENT;REEL/FRAME:054238/0001
Effective date: 20201026
|
AS | Assignment |
Owner name: MCAFEE, LLC, CALIFORNIA
Free format text: RELEASE OF INTELLECTUAL PROPERTY COLLATERAL - REEL/FRAME 045056/0676;ASSIGNOR:MORGAN STANLEY SENIOR FUNDING, INC., AS COLLATERAL AGENT;REEL/FRAME:059354/0213
Effective date: 20220301