DE69908226T2 - Device and method for finding melodies - Google Patents
Device and method for finding melodies
- Publication number
- DE69908226T2
- Authority
- DE
- Germany
- Prior art keywords
- values
- music
- pitch
- time span
- database
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Classifications
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H1/00—Details of electrophonic musical instruments
- G10H1/0033—Recording/reproducing or transmission of music for electrophonic musical instruments
- G10H1/0041—Recording/reproducing or transmission of music for electrophonic musical instruments in coded form
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/60—Information retrieval; Database structures therefor; File system structures therefor of audio data
- G06F16/63—Querying
- G06F16/632—Query formulation
- G06F16/634—Query by example, e.g. query by humming
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/60—Information retrieval; Database structures therefor; File system structures therefor of audio data
- G06F16/68—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
- G06F16/683—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2240/00—Data organisation or data communication aspects, specifically adapted for electrophonic musical tools or instruments
- G10H2240/121—Musical libraries, i.e. musical databases indexed by musical parameters, wavetables, indexing schemes using musical parameters, musical rule bases or knowledge bases, e.g. for automatic composing methods
- G10H2240/131—Library retrieval, i.e. searching a database or selecting a specific musical piece, segment, pattern, rule or parameter set
- G10H2240/141—Library retrieval matching, i.e. any of the steps of matching an inputted segment or phrase with musical database contents, e.g. query by humming, singing or playing; the steps may include, e.g. musical analysis of the input, musical feature extraction, query formulation, or details of the retrieval process
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2240/00—Data organisation or data communication aspects, specifically adapted for electrophonic musical tools or instruments
- G10H2240/171—Transmission of musical instrument data, control or status information; Transmission, remote access or control of music data for electrophonic musical instruments
- G10H2240/201—Physical layer or hardware aspects of transmission to or from an electrophonic musical instrument, e.g. voltage levels, bit streams, code words or symbols over a physical link connecting network nodes or instruments
- G10H2240/241—Telephone transmission, i.e. using twisted pair telephone lines or any type of telephone network
- G10H2240/251—Mobile telephone transmission, i.e. transmitting, accessing or controlling music data wirelessly via a wireless or mobile telephone receiver, analogue or digital, e.g. DECT, GSM, UMTS
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Library & Information Science (AREA)
- Data Mining & Analysis (AREA)
- Databases & Information Systems (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Acoustics & Sound (AREA)
- Mathematical Physics (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
- Auxiliary Devices For Music (AREA)
- Electrophonic Musical Instruments (AREA)
Description
The present invention relates to a system and a method for retrieving melodies, and in particular to a WWW-based melody retrieval system with threshold values determined using a distribution of pitches and time spans of notes from a music database.
One of the problems in setting up a melody retrieval system in which a melody is entered by a user singing it is that the input melody may contain various errors, caused by the uncertainty of the user's memory or by the user's singing ability. These errors are usually tolerated by using several threshold values and converting the input melody into approximate relative-value sequences of the pitch change (the pitch difference between adjacent notes) and of the time-span change (the inter-onset interval ratio of adjacent notes).
For example, an approximate relative pitch value is expressed using the three symbols {U(p), D(own), E(qual)}, and the sequence "do-re-mi-mi-re-do" is converted into "X-U-U-E-D-D" (where "X" denotes the first note, which has no relative value). These approximate sequences are then used in the matching process, in which the input is compared with all songs in a database.
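As an illustration only (the patent text gives no source code), the following Python sketch shows how a sequence of sung pitches could be mapped to the U/D/E symbols described above; the 0.5-semitone threshold is an assumed value chosen for the example, not one taken from the patent.

```python
def to_relative_symbols(pitches_semitones, threshold=0.5):
    """Map a pitch sequence to 'X' plus U/D/E relative-pitch symbols."""
    symbols = ["X"]  # the first note has no relative value
    for prev, curr in zip(pitches_semitones, pitches_semitones[1:]):
        diff = curr - prev
        if diff > threshold:
            symbols.append("U")   # pitch moved up
        elif diff < -threshold:
            symbols.append("D")   # pitch moved down
        else:
            symbols.append("E")   # pitch stayed (approximately) equal
    return "-".join(symbols)

# "do-re-mi-mi-re-do" as semitone offsets from "do":
print(to_relative_symbols([0, 2, 4, 4, 2, 0]))  # -> X-U-U-E-D-D
```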
Previous melody retrieval systems [Kageyama et al., 1993; Ghias et al., 1995] used static, heuristic threshold values to obtain these approximate sequences. Because it is difficult to determine optimal threshold values for all songs, these systems did not exploit the time-span information, which could potentially be useful, and used primarily the pitch information as a search cue.
It was therefore difficult to improve the matching accuracy using only the pitch information.
It was also difficult to reduce the number of answer candidates, because several songs in a large database usually exhibited the same patterns of approximate relative values of the pitch and time-span sequences.
Furthermore, the public use of music databases over computer networks such as the World Wide Web (WWW) was not considered.
In conventional music retrieval systems, the pitch information is used primarily as a search cue while the time-span information is not used effectively, and it is therefore difficult to improve the matching accuracy using only the pitch information.
According to a first aspect of the invention, a method for retrieving melodies according to claim 1 is provided.
According to a second aspect of the invention, a system for retrieving melodies according to claim 4 is provided.
Bei ihren bevorzugten Aspekten kann
die Erfindung bereitstellen:
ein WWW-basiertes System zum Wiederauffinden von
Melodien, bei dem ein Benutzer eine Melodie durch Singen eingeben
kann,
ein WWW-basiertes System zum Wiederauffinden von Melodien
mit durch die Verteilung von Tonhöhen und Zeitspannen von Noten
aus einer Musikdatenbank genau bestimmten Schwellenwerten,
ein
WWW-basiertes System zum Wiederauffinden von Melodien mit durch
die Verteilung von Tonhöhen und
Zeitspannen von Noten, die aus einer Musikdatenbank extrahiert wurden,
bestimmten Schwellenwerten, wodurch ein genauer Liedtitel schnell
erhalten werden kann,
ein WWW-basiertes System zum Wiederauffinden von
Melodien mit durch die Verteilung von Tonhöhen und Zeitspannen von Noten
aus einer Musikdatenbank bestimmten Schwellenwerten, wobei ungefähre Relativwertsequenzen
der Tonhöhe
und der Zeitspanne mit der maximalen Informationsmenge anhand einer
Musikdatenbank und der vom Benutzer eingegebenen Melodie dynamisch
erhalten werden können,
ein
WWW-basiertes System zum Wiederauffinden von Melodien, das Relativ-Tonhöhen- oder
Relativ-Zeitspanneninformationen von Noten durch Kategorisieren
in ungefähre
Relativwerte umwandeln kann, wobei die ungefähren Werte jeder Kategorie mit
gleicher Frequenz in einer Datenbank erscheinen,
ein WWW-basiertes
System zum Wiederauffinden von Melodien, das eine beliebige Anzahl
von Kategorien zum Umwandeln von Relativ-Tonhöhen- oder Relativ-Zeitspanneninformationen
von Noten in ungefähre
Relativwerte bestimmen kann,
ein WWW-basiertes System zum Wiederauffinden von
Melodien unter Verwendung des Grob-zu-Fein-Abgleichs, wobei die
Anzahl der Antwortkandidaten unter Berücksichtigung des Kompromisses
zwischen dem Grobabgleich und dem Feinabgleich verkleinert werden
kann,
ein WWW-basiertes System zum Wiederauffinden von Melodien,
das eine gesungene Melodie als eine Abfrage verwendet und das zuverlässig ist,
so daß es robust
genug ist, um mit anonymen Benutzereingaben fertigzuwerden,
eine
WWW-basierte Vorrichtung zum Wiederauffinden von Melodien, die entweder
an einer gewünschten
Position oder an einem gewünschten
Artikel angebracht werden kann, wobei eine Melodiewiederauffindung
leicht anhand der fernen Musikdatenbanken über ein WWW-Netzwerk ausgeführt werden kann,
eine
WWW-basierte Vorrichtung zum Wiederauffinden von Melodien, die in
ein Auto eingesetzt werden kann, wobei ein Autofahrer Musik durch
Singen wiederauffinden kann, wodurch es dem Autofahrer ermöglicht wird,
weiterhin sicher zu fahren, weil es nicht erforderlich ist, daß er seine
Hände zum
Wiederauffinden oder Auswählen
von Musik verwendet,
eine WWW-basierte Vorrichtung zum Wiederauffinden
von Melodien, die in einen Karaoke-Spieler eingesetzt werden kann,
wobei ein Benutzer Lieder durch Singen Wiederauffinden kann,
eine
WWW-basierte Vorrichtung zum Wiederauffinden von Melodien, die in
einem Plattenladen aufgestellt werden kann, wobei ein Benutzer Musik
durch Singen über
das Netzwerk auffinden kann, selbst wenn der Plattenladen die vom
Benutzer gewünschten
Musik-CDs nicht führt,
eine
WWW-basierte Vorrichtung zum Wiederauffinden von Melodien, die entweder
in ein Mobiltelefon oder eine tragbare Vorrichtung eingesetzt werden kann,
wobei ein Benutzer Musik überall
Wiederauffinden kann,
eine WWW-basierte Vorrichtung zum Wiederauffinden
von Melodien, an der mehrere Benutzer ihre gewünschten Melodien durch gleichzeitiges
Singen eingeben können,
und
eine Vorrichtung zum Wiederauffinden des Liedtitels aus
einer Musikdatenbank über
ein WWW-Netzwerk, wobei die Abgleichsgenauigkeit einer Musikwiederauffindung
für Melodiewiederauffindungsdienste über ein
WWW-Netzwerk hoch genug ist.In its preferred aspects, the invention can provide:
a WWW-based melody retrieval system in which a user can input a melody by singing,
a WWW-based system for retrieving melodies with threshold values precisely determined by the distribution of pitches and time spans of notes from a music database,
a WWW-based melody retrieval system with thresholds determined by the distribution of pitches and time periods of notes extracted from a music database, whereby an accurate song title can be obtained quickly,
a WWW-based system for retrieving melodies with threshold values determined by the distribution of pitches and time periods of notes from a music database, whereby approximate relative value sequences of the pitch and time period with the maximum amount of information can be obtained dynamically from a music database and the melody entered by the user .
a WWW-based melody retrieval system that can convert relative pitch or relative time span information of notes into approximate relative values by categorizing, with the approximate values of each category appearing in a database at the same frequency,
a WWW-based melody retrieval system that can determine any number of categories for converting relative pitch or relative time span information from notes into approximate relative values,
a WWW-based system for finding melodies using the coarse-to-fine comparison, the number of response candidates can be reduced taking into account the compromise between the coarse comparison and the fine comparison,
a WWW-based melody retrieval system that uses a sung melody as a query and is reliable so that it is robust enough to deal with anonymous user input,
a WWW-based melody retrieval device that can be placed either in a desired location or on a desired item, with melody retrieval easily based on the remote music database can be executed over a WWW network,
a WWW-based melody retrieval device that can be inserted into a car, wherein a driver can find music by singing, thereby enabling the driver to continue driving safely because it does not require his hands to be Retrieving or selecting music used
a WWW-based melody retrieval device that can be inserted into a karaoke player, whereby a user can retrieve songs by singing,
a WWW-based melody retrieval device that can be placed in a record store, whereby a user can find music by singing over the network even if the record store does not carry the music CDs the user desires,
a WWW-based melody retrieval device that can be inserted into either a cell phone or a portable device where a user can retrieve music anywhere,
a WWW-based melody retrieval device where multiple users can input their desired melodies by singing simultaneously, and
a device for retrieving the song title from a music database via a WWW network, the matching accuracy of a music retrieval for melody retrieval services via a WWW network being high enough.
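As referenced in the coarse-to-fine item above, the following Python sketch illustrates one possible two-stage strategy. It is an assumption made for illustration, not the patent's specific matching procedure; the data layout, the substring-based coarse pass and the alignment-based fine score are all hypothetical.

```python
def coarse_to_fine_search(query_coarse, query_fine, database, keep=10):
    """database: list of (title, coarse_seq, fine_seq) string tuples."""
    # Coarse pass: a cheap filter with few categories prunes the database.
    candidates = [entry for entry in database if query_coarse in entry[1]]

    # Fine pass: re-rank the survivors with a stricter similarity measure
    # (here: best fraction of matching symbols over all alignment offsets).
    def fine_score(entry):
        seq = entry[2]
        best = 0.0
        for off in range(max(1, len(seq) - len(query_fine) + 1)):
            window = seq[off:off + len(query_fine)]
            hits = sum(a == b for a, b in zip(window, query_fine))
            best = max(best, hits / max(1, len(query_fine)))
        return best

    candidates.sort(key=fine_score, reverse=True)
    return candidates[:keep]
```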
In a further preferred aspect, the invention provides a media retrieval system that can retrieve media containing music by means of melodies.
A preferred embodiment of the invention will now be described, by way of example only, with reference to the accompanying drawing.
Reference is made to the accompanying drawing, in which like reference numerals designate like parts throughout the several views.
As a preparation for retrieval, sequences of the relative pitch difference and of the relative time-span ratio are first obtained from the sequences of pitch and time-span values of each piece of music in a database (S101), and a histogram of the relative pitch values and a histogram of the relative time-span values of the database are formed using the distribution of all notes in the database (S102).
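A minimal Python sketch of steps S101 and S102 is given below, assuming each database piece is represented as a list of note pitches (in semitones) and a list of note time spans (in frames); this data representation and the coarse rounding of the ratios are assumptions made for illustration.

```python
from collections import Counter

def relative_sequences(pitches, spans):
    """S101: relative pitch differences and relative time-span ratios
    between each note and its predecessor."""
    pitch_diffs = [b - a for a, b in zip(pitches, pitches[1:])]
    span_ratios = [b / a for a, b in zip(spans, spans[1:])]
    return pitch_diffs, span_ratios

def build_histograms(database):
    """S102: pool the relative values of all notes of all pieces into
    one pitch histogram and one time-span histogram."""
    pitch_hist, span_hist = Counter(), Counter()
    for pitches, spans in database:
        diffs, ratios = relative_sequences(pitches, spans)
        pitch_hist.update(diffs)
        span_hist.update(round(r, 2) for r in ratios)  # coarse bins for ratios
    return pitch_hist, span_hist
```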
When these histograms are formed, the total frequency of the values in the pitch histogram is denoted Sum1 and that in the time-span histogram is denoted Sum2. The number of categories for the histogram of the relative pitch values is denoted Category_Num1, and that for the relative time-span values is denoted Category_Num2.
In this case, threshold values for the relative pitch values are determined such that each category delimited by the threshold values uniformly contains M1 (= Sum1/Category_Num1) values. Threshold values for the relative time-span values are determined similarly, so that each category uniformly contains M2 (= Sum2/Category_Num2) values (S103). Subsequently, the relative pitch difference and the relative time-span ratio are converted into the approximate relative pitch values and the approximate relative time-span values, respectively, by dividing these values into several parts, such as U, E and D.
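The threshold determination of step S103 amounts to an equal-frequency (quantile) partition of the pooled relative values. The sketch below is one way to compute such thresholds and to categorize a value; the exact rounding and tie handling are assumptions for illustration.

```python
def equal_frequency_thresholds(values, category_num):
    """S103 (sketch): thresholds that split `values` into category_num
    categories each holding roughly total / category_num values
    (i.e. M1 = Sum1 / Category_Num1 or M2 = Sum2 / Category_Num2)."""
    ordered = sorted(values)
    per_category = len(ordered) / category_num
    return [ordered[int(round(i * per_category)) - 1]
            for i in range(1, category_num)]

def categorize(value, thresholds):
    """Map a relative value to its category index, 0 .. len(thresholds)."""
    for i, t in enumerate(thresholds):
        if value <= t:
            return i
    return len(thresholds)

# With Category_Num1 = 3 the pitch categories play the role of D, E and U.
```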
In preparation for retrieval, a song is input by a user singing into a microphone.
A rise time of the voiced sound is segmented as the onset time of each note, the time difference (number of frames) to the onset time of the next note is determined as the time span of the note, and the maximum value among the fundamental frequencies contained within each note during its time span is defined as the pitch value of that note (S109).
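A compact sketch of this segmentation step (S109) is shown below, assuming that a per-frame fundamental-frequency track `f0` is already available, with 0.0 marking unvoiced frames; treating every voicing onset as a note onset is a simplification made for illustration.

```python
def segment_notes(f0):
    """S109 (sketch): note onsets, time spans (in frames) and pitch values
    from a per-frame F0 track where unvoiced frames are 0.0."""
    onsets = [i for i in range(len(f0))
              if f0[i] > 0.0 and (i == 0 or f0[i - 1] == 0.0)]
    notes = []
    for k, start in enumerate(onsets):
        end = onsets[k + 1] if k + 1 < len(onsets) else len(f0)
        span = end - start                 # frames until the next onset
        pitch = max(f0[start:end])         # maximum F0 within the span
        notes.append((pitch, span))
    return notes
```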
The relative pitch and time-span values (S110) of each note in the input song are calculated from the pitch and the time span of the preceding note. The relative values are then transmitted to a song retrieval system A; this transmission can take place over the WWW network.
The retrieval result is received from the song retrieval system A and is shown on the display.
As described in the preceding sections, the title of the desired song can be obtained by entering the melody by singing into the microphone.
Claims (10)
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP10111273A JPH11272274A (en) | 1998-03-19 | 1998-03-19 | Method for retrieving piece of music by use of singing voice |
JP11127398 | 1998-03-19 | ||
JP37808498 | 1998-12-21 | ||
JP10378084A JP2000187671A (en) | 1998-12-21 | 1998-12-21 | Music retrieval system with singing voice using network and singing voice input terminal equipment to be used at the time of retrieval |
Publications (2)
Publication Number | Publication Date |
---|---|
DE69908226D1 (en) | 2003-07-03 |
DE69908226T2 (en) | 2004-03-25 |
Family
ID=26450702
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
DE69908226T Expired - Fee Related DE69908226T2 (en) | 1998-03-19 | 1999-03-19 | Device and method for finding melodies |
Country Status (3)
Country | Link |
---|---|
US (1) | US6121530A (en) |
EP (1) | EP0944033B1 (en) |
DE (1) | DE69908226T2 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9053696B2 (en) | 2010-12-01 | 2015-06-09 | Yamaha Corporation | Searching for a tone data set based on a degree of similarity to a rhythm pattern |
Families Citing this family (110)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6345104B1 (en) | 1994-03-17 | 2002-02-05 | Digimarc Corporation | Digital watermarks and methods for security documents |
US7313251B2 (en) | 1993-11-18 | 2007-12-25 | Digimarc Corporation | Method and system for managing and controlling electronic media |
US8094949B1 (en) | 1994-10-21 | 2012-01-10 | Digimarc Corporation | Music methods and systems |
US6560349B1 (en) | 1994-10-21 | 2003-05-06 | Digimarc Corporation | Audio monitoring using steganographic information |
US6760463B2 (en) | 1995-05-08 | 2004-07-06 | Digimarc Corporation | Watermarking methods and media |
US7805500B2 (en) | 1995-05-08 | 2010-09-28 | Digimarc Corporation | Network linking methods and apparatus |
US7224819B2 (en) | 1995-05-08 | 2007-05-29 | Digimarc Corporation | Integrating digital watermarks in multimedia content |
US6505160B1 (en) * | 1995-07-27 | 2003-01-07 | Digimarc Corporation | Connected audio and other media objects |
US6829368B2 (en) | 2000-01-26 | 2004-12-07 | Digimarc Corporation | Establishing and interacting with on-line media collections using identifiers in media signals |
US6411725B1 (en) | 1995-07-27 | 2002-06-25 | Digimarc Corporation | Watermark enabled video objects |
US8429205B2 (en) | 1995-07-27 | 2013-04-23 | Digimarc Corporation | Associating data with media signals in media signal systems through auxiliary data steganographically embedded in the media signals |
US7562392B1 (en) | 1999-05-19 | 2009-07-14 | Digimarc Corporation | Methods of interacting with audio and ambient music |
US7711564B2 (en) | 1995-07-27 | 2010-05-04 | Digimarc Corporation | Connected audio and other media objects |
US6965682B1 (en) | 1999-05-19 | 2005-11-15 | Digimarc Corp | Data transmission by watermark proxy |
US7505605B2 (en) | 1996-04-25 | 2009-03-17 | Digimarc Corporation | Portable devices and methods employing digital watermarking |
US20030056103A1 (en) | 2000-12-18 | 2003-03-20 | Levy Kenneth L. | Audio/video commerce application architectural framework |
US8180844B1 (en) | 2000-03-18 | 2012-05-15 | Digimarc Corporation | System for linking from objects to remote resources |
US7930546B2 (en) * | 1996-05-16 | 2011-04-19 | Digimarc Corporation | Methods, systems, and sub-combinations useful in media identification |
US7689532B1 (en) | 2000-07-20 | 2010-03-30 | Digimarc Corporation | Using embedded data with file sharing |
US8332478B2 (en) | 1998-10-01 | 2012-12-11 | Digimarc Corporation | Context sensitive connected content |
US20070055884A1 (en) | 1999-05-19 | 2007-03-08 | Rhoads Geoffrey B | User control and activation of watermark enabled objects |
US8752118B1 (en) | 1999-05-19 | 2014-06-10 | Digimarc Corporation | Audio and video content-based methods |
US8095796B2 (en) | 1999-05-19 | 2012-01-10 | Digimarc Corporation | Content identifiers |
US8055588B2 (en) * | 1999-05-19 | 2011-11-08 | Digimarc Corporation | Digital media methods |
US7760905B2 (en) * | 1999-06-29 | 2010-07-20 | Digimarc Corporation | Wireless mobile phone with content processing |
US7406214B2 (en) * | 1999-05-19 | 2008-07-29 | Digimarc Corporation | Methods and devices employing optical sensors and/or steganography |
US8874244B2 (en) | 1999-05-19 | 2014-10-28 | Digimarc Corporation | Methods and systems employing digital content |
US7185201B2 (en) * | 1999-05-19 | 2007-02-27 | Digimarc Corporation | Content identifiers triggering corresponding responses |
US20020032734A1 (en) | 2000-07-26 | 2002-03-14 | Rhoads Geoffrey B. | Collateral data combined with user characteristics to select web site |
US6355869B1 (en) * | 1999-08-19 | 2002-03-12 | Duane Mitton | Method and system for creating musical scores from musical recordings |
US6868394B1 (en) * | 1999-09-21 | 2005-03-15 | Daniel Mele | Method and apparatus for simplified artist-buyer transactions |
US7194752B1 (en) | 1999-10-19 | 2007-03-20 | Iceberg Industries, Llc | Method and apparatus for automatically recognizing input audio and/or video streams |
DE19948974A1 (en) * | 1999-10-11 | 2001-04-12 | Nokia Mobile Phones Ltd | Method for recognizing and selecting a tone sequence, in particular a piece of music |
US6188010B1 (en) * | 1999-10-29 | 2001-02-13 | Sony Corporation | Music search by melody input |
US7224995B2 (en) * | 1999-11-03 | 2007-05-29 | Digimarc Corporation | Data entry method and system |
US7257536B1 (en) * | 1999-11-23 | 2007-08-14 | Radiant Systems, Inc. | Audio request interaction system |
US6678680B1 (en) * | 2000-01-06 | 2004-01-13 | Mark Woo | Music search engine |
KR100865247B1 (en) | 2000-01-13 | 2008-10-27 | 디지맥 코포레이션 | Authenticating metadata and embedding metadata in watermarks of media signals |
US7444353B1 (en) | 2000-01-31 | 2008-10-28 | Chen Alexander C | Apparatus for delivering music and information |
US20020073098A1 (en) * | 2000-02-28 | 2002-06-13 | Lei Zhang | Methodology and system for searching music over computer network and the internet based on melody and rhythm input |
DE10011297C2 (en) * | 2000-03-08 | 2002-03-07 | Ingolf Ruge | Procedure for creating and transferring a request to a database |
US20070163425A1 (en) * | 2000-03-13 | 2007-07-19 | Tsui Chi-Ying | Melody retrieval system |
AU2001251347A1 (en) * | 2000-04-05 | 2001-10-23 | Aei Music Network, Inc. | Expert system for play list generation |
US8121843B2 (en) | 2000-05-02 | 2012-02-21 | Digimarc Corporation | Fingerprint methods and systems for media signals |
US6970886B1 (en) * | 2000-05-25 | 2005-11-29 | Digimarc Corporation | Consumer driven methods for associating content indentifiers with related web addresses |
US7853664B1 (en) * | 2000-07-31 | 2010-12-14 | Landmark Digital Services Llc | Method and system for purchasing pre-recorded music |
MXPA02003991A (en) * | 2000-08-23 | 2002-12-13 | Koninkl Philips Electronics Nv | Method of enhancing rendering of a content item, client system and server system. |
US8205237B2 (en) | 2000-09-14 | 2012-06-19 | Cox Ingemar J | Identifying works, using a sub-linear time search, such as an approximate nearest neighbor search, for initiating a work-based action, such as an action on the internet |
FI20002161A (en) | 2000-09-29 | 2002-03-30 | Nokia Mobile Phones Ltd | Method and system for recognizing a melody |
WO2002035516A1 (en) * | 2000-10-23 | 2002-05-02 | Ntt Communications Corporation | Musical composition recognition method and system, storage medium where musical composition program is stored, commercial recognition method and system, and storage medium where commercial recognition program is stored |
FR2815760B1 (en) * | 2000-10-24 | 2003-01-24 | Philippe Ecrement | METHOD FOR CONSTITUTING A DIGITAL SIGNAL REPRESENTATIVE OF A SOUND SIGNAL AND METHOD FOR RECOGNIZING A SOUND SIGNAL |
WO2002056139A2 (en) | 2000-10-26 | 2002-07-18 | Digimarc Corporation | Method and system for internet access |
DE10058811A1 (en) | 2000-11-27 | 2002-06-13 | Philips Corp Intellectual Pty | Method for identifying pieces of music e.g. for discotheques, department stores etc., involves determining agreement of melodies and/or lyrics with music pieces known by analysis device |
US20020072982A1 (en) | 2000-12-12 | 2002-06-13 | Shazam Entertainment Ltd. | Method and system for interacting with a user in an experiential environment |
US8055899B2 (en) | 2000-12-18 | 2011-11-08 | Digimarc Corporation | Systems and methods using digital watermarking and identifier extraction to provide promotional opportunities |
WO2002051063A1 (en) | 2000-12-21 | 2002-06-27 | Digimarc Corporation | Methods, apparatus and programs for generating and utilizing content signatures |
US8103877B2 (en) | 2000-12-21 | 2012-01-24 | Digimarc Corporation | Content identification and electronic tickets, coupons and credits |
JP2004534274A (en) * | 2001-03-23 | 2004-11-11 | インスティチュート・フォー・インフォコム・リサーチ | Method and system for displaying music information on a digital display for use in content-based multimedia information retrieval |
US7248715B2 (en) * | 2001-04-06 | 2007-07-24 | Digimarc Corporation | Digitally watermarking physical media |
DE10117870B4 (en) * | 2001-04-10 | 2005-06-09 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Method and apparatus for transferring a music signal into a score-based description and method and apparatus for referencing a music signal in a database |
US8457346B2 (en) | 2001-04-24 | 2013-06-04 | Digimarc Corporation | Digital watermarking image signals on-chip |
US7046819B2 (en) | 2001-04-25 | 2006-05-16 | Digimarc Corporation | Encoded reference signal for digital watermarks |
DE10144087B4 (en) * | 2001-09-08 | 2008-10-30 | Promediascan Ag | Method for detecting and registering copyrighted music sequences in radio and television programs |
JP2005505008A (en) * | 2001-09-28 | 2005-02-17 | テレフオンアクチーボラゲツト エル エム エリクソン(パブル) | Method and system for performing a karaoke function |
ATE363118T1 (en) * | 2001-09-28 | 2007-06-15 | Ericsson Telefon Ab L M | ELECTRONIC CONNECTION DEVICE WITH KARAOKE FUNCTION |
US6724914B2 (en) | 2001-10-16 | 2004-04-20 | Digimarc Corporation | Progressive watermark decoding on a distributed computing platform |
US6528715B1 (en) * | 2001-10-31 | 2003-03-04 | Hewlett-Packard Company | Music search by interactive graphical specification with audio feedback |
US6995309B2 (en) * | 2001-12-06 | 2006-02-07 | Hewlett-Packard Development Company, L.P. | System and method for music identification |
US20030140139A1 (en) * | 2002-01-14 | 2003-07-24 | Richard Marejka | Self-monitoring and trending service system with a cascaded pipeline with a unique data storage and retrieval structures |
US7321667B2 (en) | 2002-01-18 | 2008-01-22 | Digimarc Corporation | Data hiding through arrangement of objects |
US7824029B2 (en) | 2002-05-10 | 2010-11-02 | L-1 Secure Credentialing, Inc. | Identification card printer-assembler for over the counter card issuing |
DE50214167D1 (en) * | 2002-07-10 | 2010-03-04 | Palm Inc | Method for finding a tone sequence |
CN1703734A (en) * | 2002-10-11 | 2005-11-30 | 松下电器产业株式会社 | Method and apparatus for determining musical notes from sounds |
US7606790B2 (en) * | 2003-03-03 | 2009-10-20 | Digimarc Corporation | Integrating and enhancing searching of media content and biometric databases |
AU2003304560A1 (en) * | 2003-11-21 | 2005-06-08 | Agency For Science, Technology And Research | Method and apparatus for melody representation and matching for music retrieval |
US20050216512A1 (en) * | 2004-03-26 | 2005-09-29 | Rahav Dor | Method of accessing a work of art, a product, or other tangible or intangible objects without knowing the title or name thereof using fractional sampling of the work of art or object |
CN101032106B (en) | 2004-08-06 | 2014-07-23 | 数字标记公司 | Fast signal detection and distributed computing in portable computing devices |
KR20060073100A (en) * | 2004-12-24 | 2006-06-28 | 삼성전자주식회사 | Sound searching terminal of searching sound media's pattern type and the method |
CA2628061A1 (en) * | 2005-11-10 | 2007-05-24 | Melodis Corporation | System and method for storing and retrieving non-text-based information |
US20070162761A1 (en) | 2005-12-23 | 2007-07-12 | Davis Bruce L | Methods and Systems to Help Detect Identity Fraud |
US8738749B2 (en) | 2006-08-29 | 2014-05-27 | Digimarc Corporation | Content monitoring and host compliance evaluation |
US8707459B2 (en) | 2007-01-19 | 2014-04-22 | Digimarc Corporation | Determination of originality of content |
US8010511B2 (en) | 2006-08-29 | 2011-08-30 | Attributor Corporation | Content monitoring and compliance enforcement |
US7378588B1 (en) | 2006-09-12 | 2008-05-27 | Chieh Changfan | Melody-based music search |
WO2008095190A2 (en) | 2007-02-01 | 2008-08-07 | Museami, Inc. | Music transcription |
WO2008101130A2 (en) * | 2007-02-14 | 2008-08-21 | Museami, Inc. | Music-based search engine |
US8283546B2 (en) * | 2007-03-28 | 2012-10-09 | Van Os Jan L | Melody encoding and searching system |
US7840177B2 (en) * | 2007-05-23 | 2010-11-23 | Landmark Digital Services, Llc | Device for monitoring multiple broadcast signals |
US8494257B2 (en) | 2008-02-13 | 2013-07-23 | Museami, Inc. | Music score deconstruction |
US8805110B2 (en) | 2008-08-19 | 2014-08-12 | Digimarc Corporation | Methods and systems for content processing |
US20100205628A1 (en) | 2009-02-12 | 2010-08-12 | Davis Bruce L | Media processing methods and arrangements |
US9390167B2 (en) | 2010-07-29 | 2016-07-12 | Soundhound, Inc. | System and methods for continuous audio matching |
US8049093B2 (en) * | 2009-12-30 | 2011-11-01 | Motorola Solutions, Inc. | Method and apparatus for best matching an audible query to a set of audible targets |
CA2943957C (en) * | 2010-05-04 | 2017-10-03 | Avery Li-Chun Wang | Methods and systems for synchronizing media |
US9047371B2 (en) | 2010-07-29 | 2015-06-02 | Soundhound, Inc. | System and method for matching a query against a broadcast stream |
US11062615B1 (en) | 2011-03-01 | 2021-07-13 | Intelligibility Training LLC | Methods and systems for remote language learning in a pandemic-aware world |
US10019995B1 (en) | 2011-03-01 | 2018-07-10 | Alice J. Stiebel | Methods and systems for language learning based on a series of pitch patterns |
US9035163B1 (en) | 2011-05-10 | 2015-05-19 | Soundbound, Inc. | System and method for targeting content based on identified audio and multimedia |
CN102522083B (en) * | 2011-11-29 | 2014-03-05 | 北京百纳威尔科技有限公司 | Method for searching hummed song by using mobile terminal and mobile terminal thereof |
US8832723B2 (en) | 2012-02-07 | 2014-09-09 | Turner Broadcasting System, Inc. | Method and system for a synchronous event manager for automatic content recognition |
US10957310B1 (en) | 2012-07-23 | 2021-03-23 | Soundhound, Inc. | Integrated programming framework for speech and text understanding with meaning parsing |
US9167276B2 (en) | 2012-12-28 | 2015-10-20 | Turner Broadcasting System, Inc. | Method and system for providing and handling product and service discounts, and location based services (LBS) in an automatic content recognition based system |
US9507849B2 (en) | 2013-11-28 | 2016-11-29 | Soundhound, Inc. | Method for combining a query and a communication command in a natural language computer system |
US9292488B2 (en) | 2014-02-01 | 2016-03-22 | Soundhound, Inc. | Method for embedding voice mail in a spoken utterance using a natural language processing computer system |
US11295730B1 (en) | 2014-02-27 | 2022-04-05 | Soundhound, Inc. | Using phonetic variants in a local context to improve natural language understanding |
US9564123B1 (en) | 2014-05-12 | 2017-02-07 | Soundhound, Inc. | Method and system for building an integrated user profile |
JP6245650B2 (en) * | 2014-08-26 | 2017-12-13 | 株式会社エクシング | Music search program |
WO2017028116A1 (en) * | 2015-08-16 | 2017-02-23 | 胡丹丽 | Intelligent desktop speaker and method for controlling intelligent desktop speaker |
US10701438B2 (en) | 2016-12-31 | 2020-06-30 | Turner Broadcasting System, Inc. | Automatic content recognition and verification in a broadcast chain |
CN109036463B (en) * | 2018-09-13 | 2021-02-12 | 广州酷狗计算机科技有限公司 | Method, device and storage medium for acquiring difficulty information of songs |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2797644B2 (en) * | 1990-05-02 | 1998-09-17 | ブラザー工業株式会社 | Karaoke device with vocalization function |
JP2897659B2 (en) * | 1994-10-31 | 1999-05-31 | ヤマハ株式会社 | Karaoke equipment |
EP0731446B1 (en) * | 1995-03-08 | 2001-07-04 | GENERALMUSIC S.p.A. | A microprocessor device for selection and recognition of musical pieces |
US5616876A (en) * | 1995-04-19 | 1997-04-01 | Microsoft Corporation | System and methods for selecting music on the basis of subjective content |
JP3087602B2 (en) * | 1995-05-02 | 2000-09-11 | ヤマハ株式会社 | Communication karaoke system |
US5732216A (en) * | 1996-10-02 | 1998-03-24 | Internet Angles, Inc. | Audio message exchange system |
DE19652225A1 (en) * | 1996-12-16 | 1998-06-25 | Harald Rieck | Process for automatic identification of melodies |
-
1999
- 1999-03-19 EP EP99302171A patent/EP0944033B1/en not_active Expired - Lifetime
- 1999-03-19 DE DE69908226T patent/DE69908226T2/en not_active Expired - Fee Related
- 1999-03-19 US US09/272,211 patent/US6121530A/en not_active Expired - Fee Related
Also Published As
Publication number | Publication date |
---|---|
EP0944033A1 (en) | 1999-09-22 |
EP0944033B1 (en) | 2003-05-28 |
US6121530A (en) | 2000-09-19 |
DE69908226D1 (en) | 2003-07-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
DE69908226T2 (en) | Device and method for finding melodies | |
DE60120417T2 (en) | METHOD FOR SEARCHING IN AN AUDIO DATABASE | |
DE112007001774B4 (en) | Method and system for searching music | |
DE60037119T3 (en) | ELECTRONIC STORAGE OF MUSIC DATA AND PROGRAMS, WITH THE DETECTION OF PROGRAM SEGMENTS, SUCH AS MUSIC LECTURES RECORDED, AND SYSTEM FOR THE MANAGEMENT AND PLAYING OF SUCH PROGRAM SEGMENTS | |
EP1794745B1 (en) | Device and method for changing the segmentation of an audio piece | |
DE69122017T2 (en) | METHOD AND DEVICE FOR DETECTING SIGNALS | |
DE69926481T2 (en) | DEVICE AND METHOD FOR RECORDING, DESIGNING AND PLAYING SYNCHRONIZED AUDIO AND VIDEO DATA USING VOICE RECOGNITION AND ROTARY BOOKS | |
DE60302420T2 (en) | Music searching device and method | |
DE69124360T2 (en) | Device for displaying vocal characteristics | |
DE69732195T2 (en) | A karaoke device having a server and a plurality of terminals for playing karaoke music | |
EP1093109A1 (en) | Method for recognizing and selecting a note sequence, in particular a musical piece | |
DE60303993T2 (en) | Music structure recognition device and method | |
DE69724919T2 (en) | Process for generating musical tones | |
DE10232916A1 (en) | Device and method for characterizing an information signal | |
EP1217603A1 (en) | Method for identifying musical pieces | |
DE60315880T2 (en) | DATA GENERATION APPARATUS AND METHOD FOR MUSIC COMPOSITIONS | |
DE10117870A1 (en) | Method and device for converting a music signal into a note-based description and method and device for referencing a music signal in a database | |
DE60319710T2 (en) | Method and apparatus for automatic dissection segmented audio signals | |
DE102004028693B4 (en) | Apparatus and method for determining a chord type underlying a test signal | |
DE60318450T2 (en) | Apparatus and method for segmentation of audio data in meta-patterns | |
DE3230713A1 (en) | CASSETTE TAPE DEVICE | |
DE3047801A1 (en) | ELECTRONIC MUSIC INSTRUMENT WITH KEYPAD | |
EP1377924B1 (en) | Method and device for extracting a signal identifier, method and device for creating a database from signal identifiers and method and device for referencing a search time signal | |
EP1671315B1 (en) | Process and device for characterising an audio signal | |
EP1939768A2 (en) | Method and device for selecting characterisable data records |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
8364 | No opposition during term of opposition | ||
8339 | Ceased/non-payment of the annual fee |