
GB2446529A - Audio communication between networked games terminals - Google Patents

Audio communication between networked games terminals Download PDF

Info

Publication number
GB2446529A
GB2446529A
Authority
GB
United Kingdom
Prior art keywords
control terminal
games control
communication
game
games
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB0805290A
Other versions
GB2446529B (en)
GB0805290D0 (en)
Inventor
Gregory Duddle
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Interactive Entertainment Europe Ltd
Original Assignee
Sony Computer Entertainment Europe Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Computer Entertainment Europe Ltd filed Critical Sony Computer Entertainment Europe Ltd
Priority to GB0805290A priority Critical patent/GB2446529B/en
Publication of GB0805290D0 publication Critical patent/GB0805290D0/en
Publication of GB2446529A publication Critical patent/GB2446529A/en
Application granted granted Critical
Publication of GB2446529B publication Critical patent/GB2446529B/en
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current


Classifications

    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/85Providing additional services to players
    • A63F13/87Communicating with other players during game play, e.g. by e-mail or chat
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L9/00Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
    • H04L9/40Network security protocols
    • A63F13/12
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/30Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/50Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/50Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers
    • A63F2300/57Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers details of game services offered to the player
    • A63F2300/572Communication between players during game play of non game information, e.g. e-mail, chat, file transfer, streaming of audio and streaming of video

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Security & Cryptography (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Telephonic Communication Services (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

A network comprises a plurality of processing apparatus, at least two of the processing apparatus being games control terminals, each games control terminal being operable to transmit or receive or render audio data to and from other games control terminals. The network provides controlling logic operable to determine whether the games control terminals may perform an audio-communication-task and logic operable to provide a game environment, an audio-communication-task being one or more of the transmission or receiving of audio data to or from one games control terminal to another games control terminal or the rendering of received audio data. The determination is dependent upon a game status associated with at least one of the games control terminals and/or at least one operator of a games control terminal. The game status comprises one or more associations made between communication-items found within the game environment and operators. Therefore the player whose character collects a communication item within a game environment may be the only player who can transmit voice data, such as speech, to the other players in the network. Other factors may influence the game status including position, within the game environment, distance between characters, and communication enabling actions.

Description

GAME PROCESSING
This is a divisional application from GB0414308.7.
This invention relates to electronic game processing.
Electronic games are well-known and may be categorised in many different ways, such as "racing games" (in which a player controls a vehicle around a course within the game environment); "fighting games" (in which a player controls a character within the game environment and combats other game characters); "maze games" (in which a player controls a character around a maze within the game environment); combinations of these types of game; and so on. In these games, a player may accumulate a score as the game is played; alternatively, a player's turn may be assessed by the level or position within the game that the player reaches. It is well-known for these games to provide special attributes to a player's game character if certain events occur. For example, in a fighting game there may be armour-objects distributed within the game environment (or they may appear from time to time) and if a game character collects one of these armour-objects then that character gains advantages in combat. As another example, the game may be arranged so that, if a game character reaches a certain level or is located at a certain position within the game environment, then various events happen, such as extending time limits or "healing" the game character (i.e. increasing the value of a health property associated with the game character).
It is also well-known for electronic games to be so-called multi-player games, in which more than one human player is involved in the game. The players may collaborate in teams or may play as individuals against each other. Players may provide their input to a game by general hardware controllers (such as keyboards and mice) or by more specialised hardware controllers. Several controllers may be connected to a single electronic games machine (such as a personal computer or a dedicated games console) to facilitate multi-player games.
With developments in network technology and its associated bandwidth, it is now well-known to connect several games machines, such as personal computers or dedicated games consoles, via a communications network, such as a local area network (LAN), a wide area network (WAN), or the Internet. Several players may then participate in a game even when they are located at geographically different locations. Networks dedicated to the communication of game data (and designed to be suited to the communication of such data) have been developed. The networks used for such multi-player networked games may, in addition to the games machines, also comprise other network machines such as network servers.
It is known for such networked electronic games to allow the human players to communicate with each other over the network. This may be achieved, for example, by the players composing textual messages (for example, by typing on a keyboard) and sending these messages to other players over the network. It is also known for such networked electronic games to allow audio data (such as voice data) to be distributed across the network, the audio data being input from the player via a microphone, for example. Currently, this involves each player being able to talk to every other player who is involved in the network game, or at least to a fixed subset of these players if, for example, the players are divided into teams and a player is only allowed to talk to other team members.
In a first aspect, this invention provides a network comprising: a plurality of processing apparatus, at least two of the processing apparatus being games control terminals, each games control terminal being operable to transmit audio data to and receive audio data from another one of the games control terminals and to render received audio data; the network providing controlling logic operable to determine whether the games control terminals may perform an audio-communication-task, and logic operable to provide a game environment; an audio-communication-task being one or more of: the transmission of audio data from a games control terminal to another games control terminal; the reception by a games control terminal of audio data transmitted from another games control terminal; and the rendering by a games control terminal of audio data received from another games control terminal; the determination being dependent upon a game status associated with at least one of the games control terminals and/or at least one operator of a games control terminal; and in which the game status comprises one or more associations made between communication-items found within the game environment and operators.
Further respective aspects and features of the invention are defined in the appended claims.
This invention recognises a problem with networked electronic games that allow audio data (and particularly voice data) to be communicated across the network. If several players participate in such a networked game and some or all of them talk over the network simultaneously, then it is very difficult for a player to discern who is talking to whom, and what is being said, as each player hears all of the audio data at once. This is in addition to any audio that is being generated and rendered locally to the player (i.e. not from the network). This will reduce the overall appeal of the electronic game, or, at the very least, will discourage players from participating in verbal communication across the network.
Accordingly, the invention provides a mechanism whereby the ability to talk to and/or listen to players across the network is either granted to or removed from a player depending on the current game status associated with one or more of the participating players. In this way, the number of players who can communicate audio data simultaneously over the network may be reduced to a more practical level, thus preventing too many players from being able to talk simultaneously. A player is therefore able to hear more clearly the content of audio communication occurring over the network and, if permitted, may contribute to the audio communication in the knowledge that the contribution will be discernible by the other players.
Embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings in which: Figure 1 schematically illustrates the overall system architecture of the PlayStation2; Figure 2 schematically illustrates the architecture of an Emotion Engine; Figure 3 schematically illustrates the configuration of a Graphics Synthesiser; Figure 4 schematically illustrates four system units networked together; Figure 5 is a schematic flow chart of an exemplary embodiment for controlling audio communication in a networked game; Figure 6 is a schematic flow chart of a second embodiment for controlling audio communication in a networked game; Figure 7 is a schematic flow chart of a third embodiment for controlling audio communication in a networked game; Figure 8 is a schematic flow chart of a fourth embodiment for controlling audio communication in a networked game; and Figure 9 schematically illustrates an example positioning of game characters within a game environment.
Figure 1 schematically illustrates the overall system architecture of the PlayStation2. A system unit 10 is provided, with various peripheral devices connectable to the system unit.
The system unit 10 comprises: an Emotion Engine 100; a Graphics Synthesiser 200; a sound processor unit 300 having dynamic random access memory (DRAM); a read only memory (ROM) 400; a compact disc (CD) and digital versatile disc (DVD) reader 450; a Rambus Dynamic Random Access Memory (RDRAM) unit 500; and an input/output processor (IOP) 700 with dedicated RAM 750. An (optional) external hard disk drive (HDD) 390 may be connected.
The input/output processor 700 has two Universal Serial Bus (USB) ports 715 and an iLink or IEEE 1394 port (iLink is the Sony Corporation implementation of the IEEE 1394 standard). The IOP 700 handles all USB, iLink and game controller data traffic.
For example, when a user is playing a game, the IOP 700 receives data from the game controller and directs it to the Emotion Engine 100, which updates the current state of the game accordingly. The IOP 700 has a Direct Memory Access (DMA) architecture to facilitate rapid data transfer rates. DMA involves transfer of data from main memory to a device without passing it through the CPU. The USB interface is compatible with Open Host Controller Interface (OHCI) and can handle data transfer rates of between 1.5 Mbps and 12 Mbps. Provision of these interfaces means that the PlayStation2 is potentially compatible with peripheral devices such as video cassette recorders (VCRs), digital cameras, microphones, set-top boxes, printers, keyboards, mice and joysticks.
Generally, in order for successful data communication to occur with a peripheral device connected to a USB port 715, an appropriate piece of software such as a device driver should be provided. Device driver technology is very well known and will not be described in detail here, except to say that the skilled man will be aware that a device driver or similar software interface may be required in the embodiment described here.
In the present embodiment, a USB microphone 730 is connected to the USB port.
It will be appreciated that the USB microphone 730 may be a hand-held microphone or may form part of a head-set that is worn by the human operator. The advantage of wearing a head-set is that the human operator's hands are free to perform other actions.
The microphone includes an analogue-to-digital converter (ADC) and a basic hardware-based real-time data compression and encoding arrangement, so that audio data are transmitted by the microphone 730 to the USB port 715 in an appropriate format, such as 16-bit mono PCM (an uncompressed format) for decoding at the PlayStation2 system unit 10.
Apart from the USB ports, two other ports 705, 710 are proprietary sockets allowing the connection of a proprietary non-volatile RAM memory card 720 for storing game-related information, a hand-held game controller 725 or a device (not shown) mimicking a hand-held controller, such as a dance mat.
The system unit 10 may be connected to a network adapter 805 that provides an interface (such as an Ethernet interface) to a network. This network may be, for example, a LAN, a WAN or the Internet. The network may be a general network or one that is dedicated to game related communication. The network adapter 805 allows data to be transmitted to and received from other system units 10 that are connected to the same network, (the other system units 10 also having corresponding network adapters 805).
The Emotion Engine 100 is a 128-bit Central Processing Unit (CPU) that has been specifically designed for efficient simulation of 3 dimensional (3D) graphics for games applications. The Emotion Engine components include a data bus, cache memory and registers, all of which are 128-bit. This facilitates fast processing of large volumes of multi-media data. Conventional PCs, by way of comparison, have a basic 64-bit data structure. The floating point calculation performance of the PlayStation2 is 6.2 GFLOPs.
The Emotion Engine also comprises MPEG2 decoder circuitry which allows for simultaneous processing of 3D graphics data and DVD data. The Emotion Engine performs geometrical calculations including mathematical transforms and translations and also performs calculations associated with the physics of simulation objects, for example, calculation of friction between two objects. It produces sequences of image rendering commands which are subsequently utilised by the Graphics Synthesiser 200. The image rendering commands are output in the form of display lists. A display list is a sequence of drawing commands that specifies to the Graphics Synthesiser which primitive graphic objects (e.g. points, lines, triangles, sprites) to draw on the screen and at which co-ordinates. Thus a typical display list will comprise commands to draw vertices, commands to shade the faces of polygons, render bitmaps and so on. The Emotion Engine 100 can asynchronously generate multiple display lists.
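By way of illustration, a display list can be pictured as an ordered sequence of tagged drawing commands. The sketch below is a loose model only; the command names and fields are invented for illustration and do not reflect the actual Graphics Synthesiser packet format:

```cpp
#include <cstdint>
#include <vector>

// Illustrative only: the real display list / GIF packet format differs.
enum class GsCommand : std::uint8_t { DrawVertex, ShadePolygon, RenderBitmap };

struct DisplayListEntry {
    GsCommand command;   // which primitive drawing operation to perform
    float x, y, z;       // screen/depth coordinates for the operation
    std::uint32_t rgba;  // packed colour used by shading commands
};

// A display list is an ordered sequence of drawing commands that the
// Emotion Engine hands to the Graphics Synthesiser for rendering.
using DisplayList = std::vector<DisplayListEntry>;

int main() {
    DisplayList list;
    list.push_back({GsCommand::DrawVertex, 10.0f, 20.0f, 0.5f, 0xFF0000FFu});
    list.push_back({GsCommand::ShadePolygon, 0.0f, 0.0f, 0.0f, 0x00FF00FFu});
    // The Emotion Engine may generate several such lists asynchronously.
    return list.size() == 2 ? 0 : 1;
}
```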
The Graphics Synthesiser 200 is a video accelerator that performs rendering of the display lists produced by the Emotion Engine 100. The Graphics Synthesiser 200 includes a graphics interface unit (GIF) which handles, tracks and manages the multiple display lists. The rendering function of the Graphics Synthesiser 200 can generate image data that supports several alternative standard output image formats, i.e., NTSC/PAL, High Definition Digital TV and VESA. In general, the rendering capability of graphics systems is defined by the memory bandwidth between a pixel engine and a video memory, each of which is located within the graphics processor. Conventional graphics systems use external Video Random Access Memory (VRAM) connected to the pixel logic via an off-chip bus which tends to restrict available bandwidth. However, the Graphics Synthesiser of the PlayStation2 provides the pixel logic and the video memory on a single high-performance chip which allows for a comparatively large 38.4 Gigabyte per second memory access bandwidth. The Graphics Synthesiser is theoretically capable of achieving a peak drawing capacity of 75 million polygons per second. Even with a full range of effects such as textures, lighting and transparency, a sustained rate of 20 million polygons per second can be drawn continuously. Accordingly, the Graphics Synthesiser is capable of rendering a film-quality image.
The Sound Processor Unit (SPU) 300 is effectively the soundcard of the system which is capable of recognising 3D digital sound such as Digital Theater Surround (DTS) sound and AC-3 (also known as Dolby Digital) which is the sound format used for DVDs.
A display and sound output device 305, such as a video monitor or television set with an associated loudspeaker arrangement 310, is connected to receive video and audio signals from the graphics synthesiser 200 and the sound processing unit 300.
The main memory supporting the Emotion Engine 100 is the RDRAM (Rambus Dynamic Random Access Memory) module 500 produced by Rambus Incorporated. This RDRAM memory subsystem comprises RAM, a RAM controller and a bus connecting the RAM to the Emotion Engine 100.
Figure 2 schematically illustrates the architecture of the Emotion Engine 100 of Figure 1. The Emotion Engine 100 comprises: a floating point unit (FPU) 104; a central processing unit (CPU) core 102; vector unit zero (VU0) 106; vector unit one (VU1) 108; a graphics interface unit (GIF) 110; an interrupt controller (INTC) 112; a timer unit 114; a direct memory access controller 116; an image data processor unit (IPU) 118; a dynamic random access memory controller (DRAMC) 120; and a sub-bus interface (SIF) 122. All of these components are connected via a 128-bit main bus 124.
The CPU core 102 is a 128-bit processor clocked at 300 MHz. The CPU core has access to 32 MB of main memory via the DRAMC 120. The CPU core 102 instruction set is based on MIPS III RISC with some MIPS IV RISC instructions together with additional multimedia instructions. MIPS III and IV are Reduced Instruction Set Computer (RISC) instruction set architectures proprietary to MIPS Technologies, Inc. Standard instructions are 64-bit, two-way superscalar, which means that two instructions can be executed simultaneously. Multimedia instructions, on the other hand, use 128-bit instructions via two pipelines. The CPU core 102 comprises a 16 KB instruction cache, an 8 KB data cache and a 16 KB scratchpad RAM which is a portion of cache reserved for direct private usage by the CPU.
The FPU 104 serves as a first co-processor for the CPU core 102. The vector unit 106 acts as a second co-processor. The FPU 104 comprises a floating point product sum arithmetic logic unit (FMAC) and a floating point division calculator (FDIV). Both the FMAC and FDIV operate on 32-bit values, so when an operation is carried out on a 128-bit value (composed of four 32-bit values) the operation can be carried out on all four parts concurrently. For example, two vectors can be added together in a single operation.
The vector units 106 and 108 perform mathematical operations and are essentially specialised FPUs that are extremely fast at evaluating the multiplication and addition of vector equations. They use Floating-Point Multiply-Adder Calculators (FMACs) for addition and multiplication operations and Floating-Point Dividers (FDIVs) for division and square root operations. They have built-in memory for storing micro-programs and interface with the rest of the system via Vector Interface Units (VIFs). Vector unit zero 106 can work as a coprocessor to the CPU core 102 via a dedicated 128-bit bus, so it is essentially a second specialised FPU. Vector unit one 108, on the other hand, has a dedicated bus to the Graphics Synthesiser 200 and thus can be considered as a completely separate processor. The inclusion of two vector units allows the software developer to split up the work between different parts of the CPU, and the vector units can be used in either serial or parallel connection.
Vector unit zero 106 comprises 4 FMACs and 1 FDIV. It is connected to the CPU core 102 via a coprocessor connection. It has 4 KB of vector unit memory for data and 4 KB of micro-memory for instructions. Vector unit zero 106 is useful for performing physics calculations associated with the images for display. It primarily executes non-patterned geometric processing together with the CPU core 102.
Vector unit one 108 comprises 5 FMACs and 2 FDIVs. It has no direct path to the CPU core 102, although it does have a direct path to the GIF unit 110. It has 16 KB of vector unit memory for data and 16 KB of micro-memory for instructions. Vector unit one 108 is useful for performing transformations. It primarily executes patterned geometric processing and directly outputs a generated display list to the GIF 110.
The GIF 110 is an interface unit to the Graphics Synthesiser 200. It converts data according to a tag specification at the beginning of a display list packet and transfers drawing commands to the Graphics Synthesiser 200 whilst mutually arbitrating multiple transfers. The interrupt controller (INTC) 112 serves to arbitrate interrupts from peripheral devices, except the DMAC 116.
The timer unit 114 comprises four independent timers with 16-bit counters. The timers are driven either by the bus clock (at 1/16 or 1/256 intervals) or via an external clock. The DMAC 116 handles data transfers between main memory and peripheral processors or between main memory and the scratch pad memory. It arbitrates the main bus 124 at the same time. Performance optimisation of the DMAC 116 is a key way by which to improve Emotion Engine performance. The image processing unit (IPU) 118 is an image data processor that is used to expand compressed animations and texture images. It performs I-PICTURE Macro-Block decoding, colour space conversion and vector quantisation. Finally, the sub-bus interface (SIF) 122 is an interface unit to the IOP 700. It has its own memory and bus to control I/O devices such as sound chips and storage devices.
Figure 3 schematically illustrates the configuration of the Graphics Synthesiser 200. The Graphics Synthesiser comprises: a host interface 202; a set-up/rasterizing unit; a pixel pipeline 206; a memory interface 208; a local memory 212 including a frame page buffer 214 and a texture page buffer 216; and a video converter 210.
The host interface 202 transfers data with the host (in this case the CPU core 102 of the Emotion Engine 100). Both drawing data and buffer data from the host pass through this interface. The output from the host interface 202 is supplied to the graphics synthesiser 200, which develops the graphics to draw pixels based on vertex information received from the Emotion Engine 100, and calculates information such as RGBA value, depth value (i.e. Z-value), texture value and fog value for each pixel. The RGBA value specifies the red, green, blue (RGB) colour components and the A (Alpha) component represents the opacity of an image object. The Alpha value can range from completely transparent to totally opaque. The pixel data is supplied to the pixel pipeline 206, which performs processes such as texture mapping, fogging and Alpha-blending and determines the final drawing colour based on the calculated pixel information.
The pixel pipeline 206 comprises 16 pixel engines PE1, PE2, ..., PE16 so that it can process a maximum of 16 pixels concurrently. The pixel pipeline 206 runs at 150 MHz with 32-bit colour and a 32-bit Z-buffer. The memory interface 208 reads data from and writes data to the local Graphics Synthesiser memory 212. It writes the drawing pixel values (RGBA and Z) to memory at the end of a pixel operation and reads the pixel values of the frame buffer 214 from memory. These pixel values read from the frame buffer 214 are used for pixel testing or Alpha-blending. The memory interface 208 also reads from local memory 212 the RGBA values for the current contents of the frame buffer. The local memory 212 is a 32 Mbit (4 MB) memory that is built in to the Graphics Synthesiser 200. It can be organised as a frame buffer 214, texture buffer 216 and a 32-bit Z-buffer 215. The frame buffer 214 is the portion of video memory where pixel data such as colour information is stored.
The Graphics Synthesiser uses a 2D to 3D texture mapping process to add visual detail to 3D geometry. Each texture may be wrapped around a 3D image object and is stretched and skewed to give a 3D graphical effect. The texture buffer is used to store the texture information for image objects. The Z-buffer 215 (also known as depth buffer) is the memory available to store the depth information for a pixel. Images are constructed from basic building blocks known as graphics primitives or polygons. When a polygon is rendered with Z-buffering, the depth value of each of its pixels is compared with the corresponding value stored in the Z-buffer. If the value stored in the Z-buffer is greater than or equal to the depth of the new pixel value then this pixel is determined visible so that it should be rendered and the Z-buffer will be updated with the new pixel depth. If however the Z-buffer depth value is less than the new pixel depth value the new pixel value is behind what has already been drawn and will not be rendered.
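The visibility rule described above can be expressed directly in code. The following is a minimal sketch of the comparison only, using the convention in the text (a stored value greater than or equal to the incoming depth means the new pixel is visible); it is not the actual Graphics Synthesiser pixel pipeline:

```cpp
#include <cstddef>
#include <vector>

// Minimal Z-buffer visibility test (sketch, not the GS implementation).
struct ZBuffer {
    int width;
    std::vector<float> depth;  // one depth value per pixel, initially "far"

    ZBuffer(int w, int h) : width(w), depth(static_cast<std::size_t>(w) * h, 1.0f) {}

    // Returns true (and records the new depth) if the pixel should be drawn.
    bool testAndSet(int x, int y, float newDepth) {
        float& stored = depth[static_cast<std::size_t>(y) * width + x];
        if (stored >= newDepth) {  // new pixel is in front: visible
            stored = newDepth;     // update the Z-buffer with the nearer depth
            return true;
        }
        return false;              // new pixel is behind existing geometry
    }
};

int main() {
    ZBuffer zb(640, 480);
    bool visible = zb.testAndSet(100, 100, 0.3f);  // drawn: 0.3 is nearer than 1.0
    bool hidden  = zb.testAndSet(100, 100, 0.9f);  // rejected: behind depth 0.3
    return (visible && !hidden) ? 0 : 1;
}
```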
The local memory 212 has a 1024-bit read port and a 1024-bit write port for accessing the frame buffer and Z-buffer and a 512-bit port for texture reading. The video converter 210 is operable to display the contents of the frame memory in a specified output format.
Figure 4 schematically illustrates four system units 10a, 10b, 10c and 10d networked together over a network 800, which may be, for example, a local area network (LAN), a wide area network (WAN), and/or the Internet. The network 800 may be a network that is dedicated to the communication of game data, designed to be suited to the communication of such data. The network 800 may comprise other devices (not shown) such as network servers.
For the purposes of the present description, it is assumed that the system units 10a, 10b, 10c and 10d are located sufficiently far apart that words spoken by one player cannot be heard directly by another. This represents a normal use of such system units but, of course, is not an essential technical feature of the invention.
In the example illustrated in Figure 4, each of the system units 10a, 10b, 10c and 10d is connected to a network adapter 805a, 805b, 805c and 805d respectively. The network adapters 805a, 805b, 805c and 805d provide an Ethernet interface to the network 800 for the system units 10a, 10b, 10c and 10d. In other configurations, though, the system units 10a, 10b, 10c and 10d may be networked together via different means, for example by making use of their iLink ports.
The four system units 10a, 10b, 10c and 10d are arranged to collaborate so that a networked game may be played by human players 810a, 810b, 810c', 810c" and 810d. Note that the third system unit 10c is being used by more than one player. It will be appreciated that this is an example arrangement of system units 10 and players and that, in principle, any number of system units 10 and players may be involved. In practice, due to bandwidth and data manageability constraints, there may be an upper limit on the number of system units 10 and/or players. For example, currently, up to twenty players may be involved in a networked game.
The data flowing over the network 800 between the system units 10a, 10b, 10c and 10d may comprise a variety of information, such as one or more of: (i) the inputs of the players 810a, 810b, 810c', 810c" and 810d via their respective hand-held game controllers 725; (ii) data relating to a game character associated with each of the players 810a, 810b, 810c', 810c" and 810d, such as position, health and game items collected; (iii) data relating to other game characters (such as computer generated and controlled game monsters); (iv) the current scores of the players 810a, 810b, 810c', 810c" and 810d; (v) actions performed by the players 810a, 810b, 810c', 810c" and 810d and how they affect the game environment, such as opening a door in the game environment; and (vi) audio data, such as voice data input by the players 810a, 810b, 810c', 810c" and 810d via their respective microphones 730.
It will be appreciated that, depending on the particular game being played, other information may be transferred across the network.
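One way to picture this mixed traffic is as a tagged message type carried over the network. The categories below mirror items (i) to (vi) above, but the type itself is a hypothetical sketch; the patent does not specify a wire format:

```cpp
#include <cstdint>
#include <string>
#include <vector>

// Hypothetical message categories mirroring items (i)-(vi) in the text.
enum class GameMessageType : std::uint8_t {
    ControllerInput,    // (i)   player input from a hand-held game controller
    CharacterState,     // (ii)  position, health, game items collected
    NpcState,           // (iii) computer generated and controlled characters
    ScoreUpdate,        // (iv)  the players' current scores
    EnvironmentAction,  // (v)   e.g. opening a door in the game environment
    AudioData           // (vi)  voice data input via a microphone
};

struct GameMessage {
    GameMessageType type;            // which kind of game data this carries
    std::string senderId;            // which system unit sent the message
    std::vector<std::uint8_t> body;  // type-specific payload
};

int main() {
    GameMessage m{GameMessageType::AudioData, "unit-10a", {0x01, 0x02}};
    return m.body.size() == 2 ? 0 : 1;
}
```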
Methods by which one system unit 10 collaborates with other system units 10 to provide a network game are well known and will not be described in detail. However, as an example, each of the system units 10a, 10b, 10c and 10d is provided with its own DVD storing a version of the game software. A game session is initiated by the player 810a and the other players 810b, 810c', 810c" and 810d are then invited to join the game session.
Once all of the players 810a, 810b, 810c', 810c" and 810d have joined the game session, they may then play the game together. Each of the system units 10a, 10b, 10c and 10d may process game tasks that are essentially local, such as rendering video and audio data and processing a player's input. Other game processing tasks concern more than one of the system units 10a, 10b, 10c and 10d, such as maintaining a table of game scores and deciding the actions of computer generated monsters (which are not controlled by any of the players 810a, 810b, 810c', 810c" or 810d). Such tasks may be performed by just one of the system units 10a, 10b, 10c or 10d, each of the system units 10a, 10b, 10c and 10d being informed of the outcome of the processing as appropriate, so that all of the system units 10a, 10b, 10c and 10d can operate together with a consistent game environment.
It will be appreciated that in other configurations, one or more other networked devices, such as a network server (not shown) may undertake some of the processing in order to provide the networked game, for example deciding the actions of computer generated monsters or maintaining a table of game scores.
Figure 5 is a schematic flow chart of an exemplary embodiment for controlling audio communication in a networked game. In this exemplary embodiment, the control of audio communication (such as a talk-channel) is granted to the player who currently possesses or has achieved the highest score. This means that the player with the highest score can provide audio data across the network to other players (i.e. talk at other players), but none of the other players can provide audio data across the network (i.e. they can only listen). As such, a player's actions are rewarded with control of audio communication if those actions result in the player holding the highest score.
At a step S900, it is determined which player currently has the highest score. As described above, one of the system units 10 involved in the networked game may maintain a table of the players' current game scores. This system unit 10 shall be referred to as the scoring system unit. At the step S900, the scoring system unit uses the table of scores to determine which player currently has the highest score. If two or more players share the highest score, then one of them is selected. This may be done, for example, by a random selection; alternatively, the selection may be based on other game statistics (such as accuracy of aim in a game involving firing weapons).
At a step S902, the control of audio communication is granted to the player who has been determined to have the highest score. This player shall be referred to as the talking-player. The result of this is that audio data can be transferred across the network from the talking-player to the other players; in contrast, no audio data from the other players can be transferred across the network. In other words, the talking-player can talk to/at the other players over the network whilst the other players are not able to talk to anybody at all over the network. The scoring system unit instructs the system units 10 involved in the networked game as to which player is the talking-player; each of the system units 10 then handles audio communication appropriately. Alternatively, the scoring system unit provides explicit instructions to the other system units 10 about how to handle audio communication. Appropriate handling of audio communication may be achieved by: (i) the talking-player's system unit 10 allowing itself to transmit audio data across the network and the other system units 10 prohibiting themselves from transmitting audio data across the network; (ii) every system unit 10 allowing itself to receive audio data from the talking-player's system unit 10 and prohibiting itself from receiving audio data from any other system unit 10 (by blocking certain network addresses, for example); or (iii) every system unit 10 allowing itself to render audio data received from the talking-player's system unit 10 and prohibiting itself from rendering audio data received from any other system unit 10. One of these options is sketched below.
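The sketch takes option (iii), rendering-side gating: each system unit is told the current talking-player's identity by the scoring system unit, and checks every incoming audio packet against that identity before rendering. The class and names here are hypothetical:

```cpp
#include <string>

// Rendering-side gate (option (iii)): a system unit renders only audio
// that originates from the current talking-player.
class AudioGate {
    std::string talkingPlayerId_;  // set from the scoring system unit's message
public:
    void setTalkingPlayer(const std::string& id) { talkingPlayerId_ = id; }

    // Called for every audio packet received over the network.
    bool shouldRender(const std::string& senderId) const {
        return senderId == talkingPlayerId_;
    }
};

int main() {
    AudioGate gate;
    gate.setTalkingPlayer("player-810a");
    bool rendered = gate.shouldRender("player-810a");  // talking-player: render
    bool dropped  = !gate.shouldRender("player-810b"); // anyone else: discard
    return (rendered && dropped) ? 0 : 1;
}
```

Options (i) and (ii) would apply the same comparison at the transmitting and receiving ends respectively.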
At a step S904, some or all of the players involved in the network game are informed of the identity of the talking-player. This may involve an indication using text or icons displayed on the display device 305; alternatively, the appearance of the talking-player's game character may be altered to give an indication to the other players (for example, the game character may flash or take on a glowing appearance). If the current identity of the talking-player is the same as the identity of the immediately preceding talking-player then the players need not be re-informed of the identity of the talking-player.
In a game involving rapidly changing scores, it is conceivable that the identity of the player with the highest score also changes rapidly. In order to prevent the identity of the talking-player changing too rapidly (which would result in very short/meaningless/incomplete communication), the talking-player is given control of audio communication for at least a minimum period of time, for example 20 seconds.
Therefore, at a step S906, the scoring system unit resets a wait period. At a step S908, the scoring system unit tests for the expiration of the wait period. Once the wait period has expired, processing returns to the step S900.
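The selection and wait-period loop of steps S900 to S908 might be sketched as follows; the score table, random tie-break and timing calls are assumptions standing in for whatever the scoring system unit actually uses:

```cpp
#include <algorithm>
#include <chrono>
#include <limits>
#include <map>
#include <random>
#include <string>
#include <thread>
#include <vector>

// Step S900: find the highest score; ties are broken by random selection.
std::string selectTalkingPlayer(const std::map<std::string, int>& scores,
                                std::mt19937& rng) {
    int best = std::numeric_limits<int>::min();
    for (const auto& entry : scores) best = std::max(best, entry.second);
    std::vector<std::string> leaders;
    for (const auto& entry : scores)
        if (entry.second == best) leaders.push_back(entry.first);
    std::uniform_int_distribution<std::size_t> pick(0, leaders.size() - 1);
    return leaders[pick(rng)];
}

int main() {
    std::mt19937 rng{std::random_device{}()};
    std::map<std::string, int> scores{{"810a", 12}, {"810b", 12}, {"810d", 7}};
    // One iteration of the S900-S908 loop; a real game would repeat this.
    std::string talker = selectTalkingPlayer(scores, rng);  // S900
    // S902/S904: grant control to `talker` and announce the identity.
    std::this_thread::sleep_for(std::chrono::seconds(20));  // S906-S908 wait
    return talker.empty() ? 1 : 0;
}
```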
It will be appreciated that other variants of this exemplary embodiment exist. For example: (i) more than one player may be granted the ability to talk over the network depending on the players' scores; and (ii) the processing of the wait period at the steps S906 and S908 may be omitted (or the wait period set to zero) to allow a continual assessment of control of audio communication.
Figure 6 is a schematic flow chart of a second embodiment of the present invention for controlling audio communication in a networked game. In this second embodiment, the control of audio communication (such as a talk-channel) is granted to a player (the talking-player) whose game character has collected one or more game objects of a particular type or types. This may be achieved, for example, by a player controlling a game character to move to a certain location and then instructing the game character to collect an object at that location. For example, a microphone-object may be provided in the game environment which, if collected by a player's game character, provides that player with control of audio communication. As with the first embodiment, control of audio communication allows the talking-player to provide audio data across the network to other players (i.e. talk at other players) whilst none of the other players can provide audio data across the network (i.e. they can only listen).
At a step S1000, a player's system unit 10 waits for input from the player, for example via the hand-held game controller 725. This input may be, for example, to move the player's game character within the game environment.
At a step S1002, the system unit 10 determines whether or not the player's character has collected a communication-object (i.e. a game object that permits the player to communicate audio data across the network). If the player's character has not collected a communication-object, processing returns to the step S1000; otherwise processing continues at a step S1004.
At the step S1004, the player's system unit 10 gives control of audio communication to the player and the player becomes the talking-player. This is achieved in a similar manner as at the step S902 of Figure 5.
At a step S1006, the talking-player's system unit 10 informs some or all of the players involved in the network game of the identity of the talking-player. This is done in a similar manner as at the step S904 of Figure 5.
At a step S1008, the talking-player's system unit 10 starts a wait period and tests for its expiration at a step S1010. The wait period grants the talking-player control of audio communication for a limited period of time.
Once the wait period has expired, at a step S1012, the talking-player's system unit 10 removes control of audio communication from the talking-player, i.e. the player is no longer the talking-player and is not able to communicate audio information to other players across the network. This may be achieved, for example, by: (i) the talking-player's system unit 10 prohibiting itself from transmitting audio data across the network; (ii) the talking-player's system unit 10 instructing every system unit 10 involved in the networked game to prohibit itself from receiving audio data from the talking-player's system unit 10 (by blocking the network address of the talking-player's system unit, for example); or (iii) the talking-player's system unit 10 instructing every system unit 10 involved in the networked game to prohibit itself from rendering audio data received from the talking-player's system unit 10.
At a step S1014, a new communication-object is created within the game environment, which may then be collected. The creation may occur, for example, immediately, at a random time after the preceding communication-object was collected, or at a predetermined time after the preceding communication-object was collected.
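A condensed sketch of the Figure 6 flow on one system unit is given below. The class, method names and polling style are assumptions, and the respawn policy of step S1014 is left as a stub:

```cpp
#include <chrono>
#include <optional>
#include <string>

// Sketch of the S1002-S1014 logic; not an actual system unit implementation.
class CommunicationObjectController {
    std::optional<std::string> talkingPlayer_;  // empty: nobody may talk
    std::chrono::steady_clock::time_point talkExpiry_;
    std::chrono::seconds talkDuration_;
public:
    explicit CommunicationObjectController(std::chrono::seconds d)
        : talkDuration_(d) {}

    // S1002-S1008: a player's character has collected a communication-object.
    void onObjectCollected(const std::string& playerId) {
        talkingPlayer_ = playerId;                                       // S1004
        talkExpiry_ = std::chrono::steady_clock::now() + talkDuration_;  // S1008
        // S1006: inform the other players of the talking-player (omitted).
    }

    // S1010-S1014: poll for expiry of the wait period.
    void update() {
        if (talkingPlayer_ &&
            std::chrono::steady_clock::now() >= talkExpiry_) {
            talkingPlayer_.reset();  // S1012: remove control of communication
            respawnObject();         // S1014: create a new communication-object
        }
    }

    bool mayTransmit(const std::string& playerId) const {
        return talkingPlayer_ && *talkingPlayer_ == playerId;
    }
private:
    void respawnObject() { /* place a new object in the game environment */ }
};

int main() {
    CommunicationObjectController c{std::chrono::seconds(30)};
    c.onObjectCollected("810b");
    bool ok = c.mayTransmit("810b") && !c.mayTransmit("810a");
    c.update();  // no effect until the wait period has expired
    return ok ? 0 : 1;
}
```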
It will be appreciated that other variants of the second embodiment exist. For example: (i) the steps S1008, S1010 and S1012 may be omitted, thereby granting the player control of audio communication until another player collects a communication-object; (ii) a player's game character may need to collect more than one communication-object (potentially of different types) before that player is granted control of audio communication; (iii) a player may need to perform further actions, such as instructing the game character to activate/use a collected communication-object, before control of audio communication is granted to the player; (iv) the step S1014 may be omitted so that communication-objects are not replaced after having been collected and/or used; (v) multiple communication-objects may be distributed at different positions within the game environment (such as different rooms); and/or (vi) two or more players may collect communication-objects and be granted overlapping or concurrent talk periods.
Figure 7 is a schematic flow chart of a third embodiment for controlling audio communication in a networked game. In this third embodiment, the control of audio communication (such as a talk-channel) is granted to a player (the talking-player) who has controlled their game character to be located at a specific location in the game environment and/or to a player who has reached a certain level/stage within the game. As with the first two embodiments, control of audio communication allows the talking-player to provide audio data across the network to other players (i.e. talk at other players) whilst none of the other players can provide audio data across the network (i.e. they can only listen).
At a step S1100, the player's system unit 10 tests whether the player's game character is located at a specific location in the game environment and/or whether the player has reached a certain level/stage within the game.
At a step S1102, the player's system unit 10 gives control of audio communication to the player and the player becomes the talking-player. This is achieved in a similar manner as at the step S902 of Figure 5.
At a step S1104, the talking-player's system unit 10 informs some or all of the players involved in the network game of the identity of the talking-player. This is done in a similar manner as at the step S904 of Figure 5.
At a step S1106, the talking-player's system unit 10 starts a wait period and tests for its expiration at a step S1108. The wait period grants the talking-player control of audio communication for a limited period of time.
Once the wait period has expired, at a step S1110, the talking-player's system unit 10 removes control of audio communication from the talking-player, i.e. the player is no longer the talking-player and is not able to communicate audio information to other players across the network. This may be achieved in a similar manner as at the step S1012 of Figure 6.
It will be appreciated that other variants of the third embodiment exist. For example, the steps S1106, S1108 and S1110 may be omitted so that a player retains control of audio communication whilst that player's game character is located at a specific location.
Figure 8 is a schematic flow chart of a fourth embodiment for controlling audio communication in a networked game, similar to the third embodiment. In this fourth embodiment, the control of audio communication (such as a talk-channel) is granted to a player (the talking-player) whose game character has just performed a certain action.
These actions could be actions against another player's game character (such as severely wounding or killing the other player's game character, or overtaking the other player's game character in a race game); alternatively they could be more individualistic actions (such as achieving a good score or firing a particularly accurate shot in a shooting game).
As with the first three embodiments, control of audio communication allows the talking-player to provide audio data across the network to other players (i.e. talk at other players) whilst none of the other players can provide audio data across the network (i.e. they can only listen). This allows the talking player to, for example, gloat about the action that has just been performed.
At a step S1500, the player's system unit 10 tests whether the player's game character has performed a certain game action (as described above).
At a step S1502, the player's system unit 10 gives control of audio communication to the player and the player becomes the talking-player. This is achieved in a similar manner as at the step S902 of Figure 5.
At a step S1504, the talking-player's system unit 10 informs some or all of the players involved in the network game of the identity of the talking-player. This is done in a similar manner as at the step S904 of Figure 5.
At a step S1506, the talking-player's system unit 10 starts a wait period and tests for its expiration at a step S1508. The wait period grants the talking-player control of audio communication for a limited period of time.
Once the wait period has expired, at a step S1510, the talking-player's system unit 10 removes control of audio communication from the talking-player, i.e. the player is no longer the talking-player and is not able to communicate audio information to other players across the network. This may be achieved in a similar manner as at the step S1012 of Figure 6.
It will be appreciated that other variants of the fourth embodiment exist. For example, the steps S1506, S1508 and S1510 may be omitted so that a player retains control of audio communication.
In a fifth embodiment, two players may communicate audio with each other across the network (for example within a talk-channel) if they have positioned their game characters in the game environment so that the distance between their game characters (as measured in the game environment) is less than a threshold distance. Thus only players whose game characters are near to each other may talk with each other over the network.
Several game characters may be sufficiently close to each other to allow a group of players to hold a conversation; these game characters will be said to form a conversation group.
Figure 9 schematically illustrates an example positioning of game characters within a game environment. Two game characters 1200a and 1200b are located near each other (i.e. the distance between them as measured in the game environment is less than the threshold distance), thus allowing their controlling players to talk to each other over the network. The game characters 1200a and 1200b thus form a conversation group 1210ab.
Similarly, three game characters 1200c, 1200d and 1200e are located near each other, thus allowing their controlling players to talk to each other over the network. The game characters 1200c, 1200d and 1200e thus form a conversation group 1210cde.
A game character 1200f is located near to a game character 1200g, thus allowing their controlling players to talk to each other over the network. The game characters 1200f and 1200g thus form a conversation group 1210fg. The game character 1200f is also located near to a game character 1200h, thus allowing their controlling players to talk to each other over the network. The game characters 1200f and 1200h thus form a conversation group 1210fh. However, as the game character 1200h is not located close enough to the game character 1200g, the players controlling these game characters cannot talk to each other over the network.
Finally, a game character 1200i is not located close to any of the other game characters within the game environment. The player controlling the game character 1200i therefore cannot talk to any of the other players over the network.
During the game, a player's system unit 10 is provided with positional information relating to the game characters of the other players involved in the network game. This information may be provided to the system unit 10 directly from each of the other system units 10 involved in the network game; alternatively, as described above, to maintain game consistency one of the system units 10 may be responsible for collating the positional information of each of the players' game characters and then forwarding it to every system unit 10 involved in the network game. The player's system unit 10 then calculates the distance between the game characters. This may be a direct point-to-point distance; alternatively, it may be the shortest distance within the confines of the game environment (such as the distance of a route within a maze).
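The direct point-to-point case reduces to a simple comparison; the position type below is an assumption, and the in-maze variant would substitute a path-length computation for the straight-line distance:

```cpp
#include <cmath>

struct Position { float x, y, z; };  // a character's game-environment coordinates

// Direct point-to-point distance between two game characters.
float distance(const Position& a, const Position& b) {
    const float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Two players may talk if their characters are within the threshold distance.
bool mayConverse(const Position& a, const Position& b, float threshold) {
    return distance(a, b) < threshold;
}

int main() {
    Position p1{0.0f, 0.0f, 0.0f}, p2{3.0f, 4.0f, 0.0f};
    return mayConverse(p1, p2, 10.0f) ? 0 : 1;  // distance 5 < 10: may talk
}
```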
The system unit 10 of a first player only renders audio information received from a second player if the game character of the second player is sufficiently close to the game character of the first player. This may be achieved in several ways, for example: (i) the first player's system unit 10 allowing itself to transmit audio information to the second player's system unit if their game characters are sufficiently near each other, otherwise such transmission is prohibited; (ii) the first player's system unit 10 allowing itself to receive audio information from the second player's system unit if their game characters are sufficiently near each other, otherwise such receipt is prohibited (by blocking the network address of the second player's system unit, for example); or (iii) the first player's system unit 10 allowing itself to render audio information received from the second player's system unit if their game characters are sufficiently near each other, otherwise such rendering is prohibited.
It will be appreciated that other variants of the fifth embodiment exist. For example: (i) as the conversation groups 1210fg and 1210fh share a common game character (namely the game character 1200f), the conversation groups 1210fg and 1210fh may be merged to allow the players controlling the game characters 1200f, 1200g and 1200h to communicate with each other across the network; (ii) the player controlling the game character 1200f may be provided with the ability to select with whom he would like to talk (i.e. with the player controlling the game character 1200g, with the player controlling the game character 1200h, or with both); (iii) the constituents of the conversation groups may be determined by a single system unit 10 which then informs all of the system units 10 involved in the network game about the conversation groups; and (iv) in order to control the number of game characters that form a conversation group, the threshold distance used to determine the constituents of a conversation group may be dynamically adjusted to enforce an upper limit on the size of the conversation group, as in the sketch below.
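For variant (iv), one plausible adjustment strategy, purely an assumption since the patent only states that the threshold may be dynamically adjusted, is to shrink the threshold until no character's conversation group exceeds the permitted size:

```cpp
#include <cstddef>
#include <vector>

struct Pos { float x, y; };  // 2D positions for brevity

static float dist2(const Pos& a, const Pos& b) {
    const float dx = a.x - b.x, dy = a.y - b.y;
    return dx * dx + dy * dy;
}

// Number of other characters within `threshold` of character `i`.
static std::size_t groupSize(const std::vector<Pos>& chars, std::size_t i,
                             float threshold) {
    std::size_t n = 0;
    for (std::size_t j = 0; j < chars.size(); ++j)
        if (j != i && dist2(chars[i], chars[j]) < threshold * threshold) ++n;
    return n;
}

// Shrink the threshold until every conversation group has at most `maxSize`
// other members (halving is an assumed strategy, not the patent's method).
float adjustThreshold(const std::vector<Pos>& chars, float threshold,
                      std::size_t maxSize) {
    bool tooBig = true;
    while (tooBig && threshold > 0.01f) {
        tooBig = false;
        for (std::size_t i = 0; i < chars.size(); ++i)
            if (groupSize(chars, i, threshold) > maxSize) { tooBig = true; break; }
        if (tooBig) threshold *= 0.5f;
    }
    return threshold;
}

int main() {
    std::vector<Pos> chars{{0, 0}, {1, 0}, {2, 0}, {50, 50}};
    float t = adjustThreshold(chars, 10.0f, 1);  // cap each group at one neighbour
    return (t > 0.0f) ? 0 : 1;
}
```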
In any of the embodiments described above, there may be game characters that are generated and controlled by one or more of the system units 10. For example, in a fighting game, a player may combat one or more opponent game characters that are generated and controlled by a system unit 10. It will be appreciated that, during the game, a computer controlled game character may, by virtue of its position, score, actions, and/or game items collected, be able to communicate audio data to one or more of the human players. Such audio data may vary depending on the current status of the game. For example, a computer controlled game character may boast that it is about to defeat a human controlled opponent.
It will be appreciated that the audio data controlled by any of the embodiments described above may be only a subset of the total audio data communicated during the game. For example, the communication of data concerning verbal inputs by the players may be controlled according to any of the embodiments described above whilst the communication of other audio data (such as background music) may be controlled by other means.
It will be appreciated that the embodiments described may be combined to provide a variety of means by which a player may gain control of audio communication.
In so far as the embodiments of the invention described above are implemented, at least in part, using software-controlled data processing apparatus, it will be appreciated that a computer program providing such software control, a storage medium by which such a computer program is stored and a transmission medium by which such a computer program is transmitted are envisaged as aspects of the present invention.

Claims (30)

  1. A network comprising: a plurality of processing apparatus, at least two of the processing apparatus being games control terminals, each games control terminal being operable to transmit audio data to and receive audio data from another one of the games control terminals and to render received audio data; the network providing controlling logic operable to determine whether the games control terminals may perform an audio-communication-task, and logic operable to provide a game environment; an audio-communication-task being one or more of: the transmission of audio data from a games control terminal to another games control terminal; the reception by a games control terminal of audio data transmitted from another games control terminal; and the rendering by a games control terminal of audio data received from another games control terminal; the determination being dependent upon a game status associated with at least one of the games control terminals and/or at least one operator of a games control terminal; and in which the game status comprises one or more associations made between communication-items found within the game environment and operators.
  2. 2. A network according to claim I in which the game status is dependent upon at least one action performed by an operator of at least one of the games control terminals.
  3. 3. A network according to claim 1 or 2, in which the controlling logic determines that a games control terminal may perform an audio-communication-task if the operator of the games control terminal is associated with one or more of the communication-items.
  4. 4. A network according to any one of the preceding claims, in which the controlling logic determines that a games control terminals may not perform an audio-communication-task in respect of another games control terminal if the operator of the other games control terminal is not associated with one or more of the communication-items.
  5. 5. A network according to any one of the preceding claims, in which an operator becomes disassociated with a communication-item after a predetermined period of time after the beginning of the association of the operator with the communication-item.
  6. 6. A network according to any one of the preceding claims, the network providing logic operable to associate an operator with a position within the game environment.
  7. 7. A network according to claim 6, the network providing logic operable to designate one or more positions within the game environment as a communication-position, the w game status comprising the determinations of whether the positions associated with the operators are communication-positions.
8. A network according to claim 7, in which the controlling logic determines that a games control terminal may perform an audio-communication-task if the position associated with an operator of that games control terminal is a communication-position.
9. A network according to claim 5 or claim 6, in which the controlling logic determines that a games control terminal may not perform an audio-communication-task in respect of another games control terminal if the position associated with an operator of that other games control terminal is not a communication-position.
  10. A network according to claim 6, in which the game status comprises the distance between the positions associated with the operators.
11. A network according to claim 10, in which the controlling logic determines that a games control terminal may perform an audio-communication-task in respect of another games control terminal if the distance between the position associated with an operator of that games control terminal and the position associated with an operator of the other games control terminal is less than a threshold distance.
12. A network according to claim 11, in which the threshold distance is a predetermined distance.
13. A network according to claim 11, operable to dynamically adjust the threshold distance.
14. A network according to any one of the preceding claims, operable to associate at least one game character with each games control terminal and/or each operator.
15. A network according to claim 14 in which the game status comprises one or more communication-enabling-actions performed by a game character.
16. A network according to claim 15, in which the controlling logic determines that a games control terminal may perform an audio-communication-task if the game character associated with the games control terminal or the game character associated with the operator of the games control terminal performs one or more communication-enabling-actions.
17. A network according to claim 14 or claim 15, in which the controlling logic determines that a games control terminal may not perform an audio-communication-task in respect of another games control terminal if the game character associated with the other games control terminal or the game character associated with the operator of the other games control terminal does not perform one or more communication-enabling-actions.
18. A network according to any one of the preceding claims, the controlling logic being operable to provide information to an operator of one of the games control terminals about a determination made by the controlling logic.
19. A network according to claim 18, in which the information is visual and/or audible information.
20. A network according to any one of the preceding claims, in which audio data is input by an operator of a games control terminal via a microphone connected to the games control terminal.
21. A network according to any one of the preceding claims, in which the audio data controlled by the controlling logic forms a subset of the total amount of audio data communicated between the games control terminals.
22. A network substantially as hereinbefore described with reference to the accompanying drawings.
23. A method of controlling audio data within a network for providing a game environment, the network comprising: a plurality of processing apparatus, at least two of the processing apparatus being games control terminals, each games control terminal being operable to communicate audio data to and receive audio data from another one of the games control terminals and to render received audio data; the method comprising the steps of: (a) forming a game status associated with at least one operator of at least one of the games control terminals; and (b) in dependence upon the game status, allowing or disallowing one or more of: a games control terminal communicating audio data to another games control terminal; a games control terminal receiving audio data from another games control terminal; and a games control terminal rendering audio data received from another games control terminal, and in which the game status comprises one or more associations made between communication-items found within the game environment and operators.
24. A method of controlling audio data within a network substantially as hereinbefore described with reference to the accompanying drawings.
25. Computer software having program code for carrying out a method according to claim 23.
26. A providing medium by which software according to claim 25 is provided.
27. A medium according to claim 26, the medium being a transmission medium.
28. A medium according to claim 26, the medium being a storage medium.
29. A games control terminal connectable to a network and operable to transmit audio data to and receive audio data from another one of the games control terminals and to render received audio data; the terminal providing controlling logic operable to determine whether the terminal may perform an audio-communication-task and logic operable to provide a game environment; an audio-communication-task being one or more of: the transmission of audio data from the terminal to another games control terminal connected to the network; the reception by the terminal of audio data transmitted from another games control terminal; and the rendering by the terminal of audio data received from another games control terminal; the determination being dependent upon a game status associated with the terminal or one or more other games control terminals connected to the network and/or at least one operator of a games control terminal connected to the network, and the game status comprising one or more associations made between communication-items found within the game environment and operators.
30. A games control terminal substantially as hereinbefore described with reference to the accompanying drawings.
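To make the determinations recited in claims 1 to 13 concrete, the following is a minimal sketch only, assuming an in-memory game status; the class and function names (GameStatus, may_perform_task and so on) are hypothetical and do not appear in the specification:

    import math
    import time

    class GameStatus:
        # Tracks the associations that the claims call the 'game status'.
        def __init__(self, threshold_distance: float = 10.0,
                     item_ttl: float = 30.0):
            self.threshold_distance = threshold_distance  # claims 11 to 13
            self.item_ttl = item_ttl      # claim 5: timed disassociation
            self._items = {}              # operator -> time item was acquired
            self._positions = {}          # operator -> (x, y) in the game world
            self._comm_positions = set()  # claim 7: designated positions

        def associate_item(self, operator: str) -> None:
            self._items[operator] = time.monotonic()

        def has_item(self, operator: str) -> bool:
            t = self._items.get(operator)
            return t is not None and time.monotonic() - t < self.item_ttl

        def set_position(self, operator: str, pos: tuple) -> None:
            self._positions[operator] = pos

        def in_comm_position(self, operator: str) -> bool:
            return self._positions.get(operator) in self._comm_positions

        def distance(self, a: str, b: str) -> float:
            (xa, ya), (xb, yb) = self._positions[a], self._positions[b]
            return math.hypot(xa - xb, ya - yb)

    def may_perform_task(status: GameStatus, sender: str, receiver: str) -> bool:
        # Claims 3 and 4: both operators hold a communication-item.
        if status.has_item(sender) and status.has_item(receiver):
            return True
        # Claim 8: the sender occupies a designated communication-position.
        if status.in_comm_position(sender):
            return True
        # Claim 11: the operators are within the threshold distance,
        # which claim 13 allows to be adjusted dynamically.
        try:
            return status.distance(sender, receiver) < status.threshold_distance
        except KeyError:
            return False  # unknown positions: deny by default

The sketch collapses several independently claimed conditions into one function for brevity; an actual implementation would apply whichever conditions the game in question uses.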
GB0805290A 2004-06-25 2004-06-25 Game processing Expired - Lifetime GB2446529B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB0805290A GB2446529B (en) 2004-06-25 2004-06-25 Game processing

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB0805290A GB2446529B (en) 2004-06-25 2004-06-25 Game processing
GB0414308A GB2415392B (en) 2004-06-25 2004-06-25 Game processing

Publications (3)

Publication Number Publication Date
GB0805290D0 GB0805290D0 (en) 2008-04-30
GB2446529A true GB2446529A (en) 2008-08-13
GB2446529B GB2446529B (en) 2008-11-05

Family

ID=32800217

Family Applications (2)

Application Number Title Priority Date Filing Date
GB0805290A Expired - Lifetime GB2446529B (en) 2004-06-25 2004-06-25 Game processing
GB0414308A Expired - Lifetime GB2415392B (en) 2004-06-25 2004-06-25 Game processing

Family Applications After (1)

Application Number Title Priority Date Filing Date
GB0414308A Expired - Lifetime GB2415392B (en) 2004-06-25 2004-06-25 Game processing

Country Status (2)

Country Link
GB (2) GB2446529B (en)
WO (1) WO2006000786A1 (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100500766C (en) 2004-06-30 2009-06-17 Dow Corning Corporation Fluorocarbon elastomer silicone vulcanizates
JP4074879B1 (en) * 2006-10-18 2008-04-16 Konami Digital Entertainment Co., Ltd. Game device, message display method, and program
JP5017013B2 (en) 2007-08-08 2012-09-05 Konami Digital Entertainment Co., Ltd. Network game system, network game system control method and program
US20090049128A1 (en) * 2007-08-17 2009-02-19 Sony Computer Entertainment America Inc. Schemes for game chat routing and taunt control
JP5957177B2 (en) * 2007-12-21 2016-07-27 Dolby Laboratories Licensing Corporation Asynchronous audio for network games
JP2011510409A (en) * 2008-01-17 2011-03-31 Vivox Inc. A scalable technique for providing real-time avatar-specific streaming data in a virtual reality system using an avatar-rendered environment
CN109873751B (en) * 2019-01-11 2020-10-09 Gree Electric Appliances, Inc. of Zhuhai Group chat voice information processing method and device, storage medium and server
EP4311585A1 (en) * 2022-07-29 2024-01-31 Utopia Music AG Method for tracking audio consumption in virtual environments

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001314657A (en) * 2000-05-08 2001-11-13 Sega Corp Network system and storage medium
US6935959B2 (en) * 2002-05-16 2005-08-30 Microsoft Corporation Use of multiple player real-time voice communications on a gaming device
US7464272B2 (en) * 2003-09-25 2008-12-09 Microsoft Corporation Server control of peer to peer communications

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6241612B1 (en) * 1998-11-09 2001-06-05 Cirrus Logic, Inc. Voice communication during a multi-player game
US20040109023A1 (en) * 2002-02-05 2004-06-10 Kouji Tsuchiya Voice chat system
EP1518592A1 (en) * 2003-09-25 2005-03-30 Microsoft Corporation Visual indication of current voice speaker

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9661270B2 (en) 2008-11-24 2017-05-23 Shindig, Inc. Multiparty communications systems and methods that optimize communications based on mode and available bandwidth
US10542237B2 (en) 2008-11-24 2020-01-21 Shindig, Inc. Systems and methods for facilitating communications amongst multiple users
US8390670B1 (en) 2008-11-24 2013-03-05 Shindig, Inc. Multiparty communications systems and methods that optimize communications based on mode and available bandwidth
US8902272B1 (en) 2008-11-24 2014-12-02 Shindig, Inc. Multiparty communications systems and methods that employ composite communications
US8917310B2 (en) 2008-11-24 2014-12-23 Shindig, Inc. Multiparty communications systems and methods that optimize communications based on mode and available bandwidth
US9041768B1 (en) 2008-11-24 2015-05-26 Shindig, Inc. Multiparty communications systems and methods that utilize multiple modes of communication
US8405702B1 (en) 2008-11-24 2013-03-26 Shindig, Inc. Multiparty communications systems and methods that utilize multiple modes of communication
US9215412B2 (en) 2008-11-24 2015-12-15 Shindig, Inc. Multiparty communications systems and methods that optimize communications based on mode and available bandwidth
US9357169B2 2008-11-24 2016-05-31 Shindig, Inc. Multiparty communications systems and methods that utilize multiple modes of communication
US9782675B2 (en) 2008-11-24 2017-10-10 Shindig, Inc. Systems and methods for interfacing video games and user communications
US9401937B1 (en) 2008-11-24 2016-07-26 Shindig, Inc. Systems and methods for facilitating communications amongst multiple users
US9124760B2 (en) 2009-01-15 2015-09-01 Shindig, Inc. Systems and methods for interfacing video games and user communications
US8647206B1 (en) 2009-01-15 2014-02-11 Shindig, Inc. Systems and methods for interfacing video games and user communications
US9737804B2 (en) 2009-01-15 2017-08-22 Shindig, Inc. Systems and methods for interfacing video games and user communications
US9947366B2 (en) 2009-04-01 2018-04-17 Shindig, Inc. Group portraits composed using video chat systems
US9712579B2 2009-04-01 2017-07-18 Shindig, Inc. Systems and methods for creating and publishing customizable images from within online events
US9779708B2 2009-04-24 2017-10-03 Shindig, Inc. Networks of portable electronic devices that collectively generate sound
US10271010B2 (en) 2013-10-31 2019-04-23 Shindig, Inc. Systems and methods for controlling the display of content
US9952751B2 (en) 2014-04-17 2018-04-24 Shindig, Inc. Systems and methods for forming group communications within an online event
US9733333B2 (en) 2014-05-08 2017-08-15 Shindig, Inc. Systems and methods for monitoring participant attentiveness within events and group assortments
US9711181B2 2014-07-25 2017-07-18 Shindig, Inc. Systems and methods for creating, editing and publishing recorded videos
US9734410B2 (en) 2015-01-23 2017-08-15 Shindig, Inc. Systems and methods for analyzing facial expressions within an online classroom to gauge participant attentiveness
US10133916B2 (en) 2016-09-07 2018-11-20 Steven M. Gottlieb Image and identity validation in video chat events

Also Published As

Publication number Publication date
GB2446529B (en) 2008-11-05
WO2006000786A1 (en) 2006-01-05
GB0805290D0 (en) 2008-04-30
GB2415392B (en) 2008-11-05
GB0414308D0 (en) 2004-07-28
GB2415392A (en) 2005-12-28

Similar Documents

Publication Publication Date Title
WO2006000786A1 (en) Real-time voice-chat system for a networked multiplayer game
EP1880576B1 (en) Audio processing
US20020142834A1 (en) Game screen switching method performed in game machine and network game system, and program for executing the method
US20050245317A1 (en) Voice chat in game console application
JP2010535363A (en) Virtual world avatar control, interactivity and communication interactive messaging
GB2426169A (en) Controlling the respective volume of each of a plurality of loudspeakers
US20090247249A1 (en) Data processing
AU2005201955A1 (en) Multi-sensory emoticons in a communication system
JP2004130003A (en) Game system, program, and information storage medium
US8360856B2 (en) Entertainment apparatus and method
KR100865005B1 (en) Image generation device, automatic generation method, and medium for recording the program
WO2006024873A2 (en) Image rendering
US20100035678A1 (en) Video game
US7980955B2 (en) Method and apparatus for continuous execution of a game program via multiple removable storage mediums
JP4508719B2 (en) Program, information storage medium, and game system
EP1072298A1 (en) Display method for a confrontation type video game for displaying different information to players, storage medium and video game system
WO2008035027A1 (en) Video game
EP1889645A2 (en) Data processing
JP2002113261A (en) Game system, storage medium and entertainment device
JP2024078144A (en) Information processing system, information processing device, and program
YOU et al. Cutting-edge consoles target the television
JP2004089458A (en) Game system, program, and information storage medium

Legal Events

Date Code Title Description
PE20 Patent expired after termination of 20 years

Expiry date: 20240624