US8138945B2 - Sensor node - Google Patents
- Publication number: US8138945B2 (application US12/153,689)
- Authority: US (United States)
- Prior art keywords
- sensor node
- name
- data
- type sensor
- controller
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Classifications
- H—ELECTRICITY; H04—ELECTRIC COMMUNICATION TECHNIQUE; H04Q—SELECTING
  - H04Q9/00—Arrangements in telecontrol or telemetry systems for selectively calling a substation from a main station, in which substation desired apparatus is selected for applying a control signal thereto or for obtaining measured values therefrom
  - H04Q2209/00—Arrangements in telecontrol or telemetry systems
    - H04Q2209/40—Arrangements in telecontrol or telemetry systems using a wireless architecture
    - H04Q2209/80—Arrangements in the sub-station, i.e. sensing device
      - H04Q2209/84—Measuring functions
        - H04Q2209/845—Measuring functions where the measuring is synchronized between sensing devices
      - H04Q2209/88—Providing power supply at the sub-station
Definitions
- the present invention relates to a business microscope system for visualizing the state of an organization, and more particularly to a sensor node which is a terminal for obtaining and transmitting physical quantities in such a system.
- the relationship may be labeled, for example, as a manager, a subordinate, or a friend. It may also include mutual feelings such as like, dislike, confidence, or influence. Communication is a very important tool for establishing a face-to-face relationship. For this reason, the relationship could be examined by obtaining records of communications.
- the sensor network is a technology in which terminals equipped with sensors and a wireless communication circuit are attached to environments, objects, persons, and the like, and various types of information are extracted from the sensors through wireless communication in order to obtain and control their state.
- Physical quantities are obtained by the sensors for detecting such communication. Examples of the physical quantity include infrared radiation for detecting face-to-face communication state, voice for detecting speech and environment, and acceleration for detecting activity and movements of a person.
- the business microscope system is a system that visualizes the state of an organization by detecting the movements of persons and the face-to-face communication from the physical quantities obtained by sensors. That is, the business microscope system helps to improve the organization.
- the physical quantities obtained by sensors are important for analyzing a face-to-face communication, in connection with the time at which an event occurs and its context, as well as the relationship between physical quantities obtained by different sensors. If any data goes missing, it is difficult to accurately evaluate the face-to-face communication. Thus, the data should be continuously obtained as long as the terminal is attached.
- a terminal is generally operated by a built-in secondary battery, or operated by an external power supply when power is supplied from the outside. It is further described that the built-in secondary battery is charged in parallel while the communication operation is continued regardless of the presence or absence of the external power supply.
- when the external power supply is connected, only the power supply source is switched to the external power supply. The communication content and the communication frequency remain the same as in normal operation.
- data for rewriting the firmware of a sensor node is larger in size than the physical data obtained by the sensors, and may be entirely unusable when even a part of it is missing. Transferring such data during battery operation poses problems such as wastefully consuming power and squeezing the communication bandwidth due to retransmission control.
- the present invention aims at providing a sensor node capable of measuring physical quantities without increasing power consumption during battery operation and without unnecessarily squeezing the communication bandwidth, while preventing missing data caused by the power supply and communication problems described above.
- the business microscope is assumed to be applied to an office environment, in which physical quantities are obtained by name-tag type sensor nodes.
- Each name-tag type sensor node does not measure physical quantities during night hours when its wearer goes home from the office.
- the name-tag type sensor node operates by a built-in secondary battery when the wearer is in the office, and is attached to a cradle or connected to an external power supply to charge the secondary battery when the wearer is at home.
- the inventors of the present application focused on the fact that an external power supply unit for supplying power from the outside to a name-tag type sensor node, and a base station for communicating with the name-tag type sensor node, are provided on a desk or in the vicinity thereof.
- the name-tag type sensor node and the base station are then close to each other on the desk, so communication between them is stable.
- the communication bandwidth is assumed to be unused during this period.
- the present invention is a name-tag type sensor node for obtaining face-to-face communication.
- on detecting the connection of an external power supply, the name-tag type sensor node shifts to a mode for transmitting and receiving a large amount of data by increasing the communication frequency, or to a mode for transmitting and receiving data that must be reliable.
- the sensor node thus transfers a large amount of data, or data that must be reliable, while preventing battery exhaustion and unnecessary congestion of the communication bandwidth. In this way, the sensor node can transmit and receive data without loss.
- FIGS. 1A to 1C are block diagrams showing a flow of processes in a business microscope system;
- FIGS. 2A and 2B are block diagrams showing the configuration of a name-tag type sensor node, which is a first embodiment, in the entire business microscope system;
- FIGS. 3A to 3E are external views of the name-tag type sensor node according to the first embodiment;
- FIGS. 4A to 4C are views showing the placement of infrared transceiver modules in the name-tag type sensor node according to the first embodiment;
- FIG. 5 is a view showing the axes of triaxial acceleration that are detected by the name-tag type sensor node according to the first embodiment;
- FIG. 6 is a view showing the button operation and screen transition of the name-tag type sensor node according to the first embodiment;
- FIG. 7 is a view showing an example of the configuration between a cradle and the name-tag type sensor node according to the first embodiment;
- FIG. 8 is a view showing the connection relationship among a battery for cradle CRDBATT, a cradle CRD, and the name-tag type sensor node according to the first embodiment;
- FIG. 9 is a block diagram showing a specific example of the hardware configuration of the name-tag type sensor node according to the first embodiment;
- FIG. 10 is a block diagram showing a specific example of the hardware configuration of the cradle CRD for the name-tag type sensor node according to the first embodiment;
- FIG. 11 is a block diagram showing a specific example of the hardware configuration of the battery for cradle according to the first embodiment;
- FIGS. 12A to 12F are views showing an operation sequence in which the name-tag type sensor node according to the first embodiment obtains and transmits physical quantities from sensors;
- FIGS. 13A to 13D are views showing a data flow and the timing in which data is transferred from the name-tag type sensor node according to the first embodiment to a base station, and then to a sensor net server SS;
- FIG. 14 is a flowchart showing the process of the name-tag type sensor node according to the first embodiment; and
- FIGS. 15A to 15K are views showing the relationship of operation timings when plural name-tag type sensor nodes according to the first embodiment face each other.
- the business microscope is a system that observes the state of a person by a sensor node worn by the person, and displays the relationships among persons as the activities and evaluation (performance) of an actual organization in a picture, in order to improve the organization.
- the data obtained by the sensor node, such as face-to-face communication detection, activity and movements, and voice, is collectively referred to as dynamics data.
- FIGS. 1A, 1B, and 1C are diagrams showing the overall flow of processes performed in a business microscope system which is an embodiment of the present invention. The flow is divided into several parts for illustrative convenience, but the processes shown in the figures are performed in association with each other. Shown here is a series of flows from the acquisition of organizational dynamics data (BMA) by plural name-tag type sensor nodes (NNa to NNj) to the display (BMF) of the relationships among persons as organizational activities, together with the actual organizational evaluation (performance).
- the system performs the organizational dynamics data acquisition (BMA), performance input (BMP), organizational dynamics data collection (BMB), inter-data alignment (BMC), correlation coefficient study (BMD), organizational activity analysis (BME), and organizational activity display (BMF) in an appropriate order.
- a name-tag type sensor node A includes: sensors such as an acceleration sensor (ACC), an infrared transceiver (TRIR), and a microphone (MIC); an image screen (IRD) for displaying face-to-face communication information obtained by the infrared transceiver; a user interface (RTG) for inputting ratings; and a microcomputer and a wireless transmission function that are not shown in the figures.
- the acceleration sensor (ACC) detects the acceleration of the name-tag type sensor node A (NNa), namely, the acceleration of a person A (not shown) wearing the name-tag type sensor node A (NNa).
- the infrared transceiver (TRIR) detects the facing state of the name-tag type sensor node A (NNa), namely, the state in which the name-tag type sensor node A (NNa) is facing the other name-tag type sensor node.
- the state in which the name-tag type sensor node A (NNa) is facing the other name-tag type sensor node means that the person A wearing the name-tag type sensor node A (NNa) is facing a person wearing the other name-tag type sensor node.
- the microphone (MIC) detects the voice around the name-tag type sensor node A (NNa).
- the system of this embodiment includes plural name-tag type sensor nodes (name-tag type sensor nodes A (NNa) to J (NNj)).
- Each name-tag type sensor node is worn by each person.
- the name-tag type sensor node A (NNa) is worn by the person A
- the name-tag type sensor node B (NNb) is worn by a person B (not shown). This is for the analysis of the relationships among persons, as well as for the display of the performance of the organization.
- the name-tag type sensor nodes B (NNb) to J (NNj) each include the sensors, microcomputer, and wireless transmission function.
- the name-tag type sensor nodes are simply referred to as the name-tag type sensor node (NN).
- the name-tag type sensor node (NN) performs sensing by the sensors constantly (or repeatedly at a short interval). Then, the name-tag type sensor node (NN) wirelessly transmits the obtained data (sensing data) at a predetermined interval. The interval at which the data is transmitted may be the same as the sensing interval, or may be larger than the sensing interval. At this time, the sensing time and the identifier (ID) unique to the sensing name-tag type sensor node (NN) are added to the transmission data. The data is wirelessly transmitted in bulk to suppress power consumption, so that the name-tag type sensor node (NN) worn by a person is kept available for a long time. It is preferable that the same sensing interval be used in all the name-tag type sensor nodes (NN) for the subsequent analysis.
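As an illustration of this batching behavior, the following is a minimal sketch in Python. The names (SensingRecord, SensingBuffer, flush, radio.send) are hypothetical, not taken from the patent; the sketch only shows how each sample could be stamped with the sensing time and the node's unique ID and then transmitted in bulk at a coarser interval than the sensing interval.

```python
import time
from dataclasses import dataclass, field

@dataclass
class SensingRecord:
    node_id: str         # identifier (ID) unique to the sensing node
    timestamp: float     # sensing time added to the transmission data
    acceleration: tuple  # triaxial acceleration sample
    ir_ids_seen: list    # terminal IDs received by infrared
    voice_energy: float  # integrated voice energy

@dataclass
class SensingBuffer:
    node_id: str
    records: list = field(default_factory=list)

    def sense(self, acc, ir_ids, voice):
        # Sensing runs constantly, or repeatedly at a short interval.
        self.records.append(
            SensingRecord(self.node_id, time.time(), acc, ir_ids, voice))

    def flush(self, radio):
        # Transmission happens at a predetermined (possibly larger) interval;
        # sending the accumulated records in bulk suppresses power consumption.
        if self.records:
            radio.send(self.records)
            self.records = []
```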
- the performance input is a process for inputting values indicating performance.
- the performance means a subjective or objective evaluation that is determined based on certain criteria.
- a person who wears the name-tag type sensor node (NN) inputs values of subjective evaluation (performance) at a predetermined timing, based on certain criteria such as current work achievement, contribution to the organization, and satisfaction with the organization.
- the predetermined timing may be, for example, once per several hours, once a day, or the time at which an event such as a meeting is completed.
- the person wearing the name-tag type sensor node (NN) can input values of performance by operating the name-tag type sensor node (NN), or by operating a personal computer (PC) such as a client (CL).
- the performance of the organization may be calculated from the performance of an individual.
- the objective data such as sales or cost, as well as the data already digitized such as questionnaire results, may be periodically input as the performance.
- when a numerical value is automatically obtained, such as an error occurrence rate in production management or other fields, the obtained value may be automatically input as a performance value.
- the data wirelessly transmitted from each name-tag type sensor node (NN) is collected in the organizational dynamics data collection (BMB), and stored in a database.
- a data table is generated for each name-tag type sensor node (NN), namely, for each person wearing the name-tag type sensor node (NN).
- the collected data is classified based on the unique ID, and stored in the data table in order of the sensing time. If the table is not generated for each name-tag type sensor node (NN), it is necessary to have a column for the ID information of the name-tag type sensor node or the person, in the data table.
- a data table A (DTBa) in FIG. 1A shows an example of a simplified data table.
- the performance value input in the performance input is stored in a performance database (PDB), together with the time information.
- in the inter-data alignment (BMC), data relating to two arbitrary persons is aligned based on the time information so that the data relating to the two persons (namely, the data obtained by the name-tag type sensor nodes (NN) worn by the relevant persons) can be compared.
- the aligned data is stored in a table.
- the data of the same time is stored in the same record (line).
- the data of the same time is two data pieces including the physical quantities detected by the two name-tag type sensor nodes (NN) at the same time point.
- the data of the nearest time may be approximately used as the data of the same time. In this case, the data of the nearest time is stored in the same record.
- the times of the data stored in the same record are adjusted, for example, to the average of the nearest times.
- these data pieces are not necessarily stored in the table, and may be stored in such a way that the data can be compared in chronological order.
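The alignment step can be pictured with a small sketch. This is not the patent's implementation; it assumes each node's records are (timestamp, data) pairs sorted by time, pairs each record of person A with person B's record of the nearest time, and stores the average of the two times in the combined record, as described above. The function name and tolerance parameter are illustrative.

```python
def align_nearest(rows_a, rows_b, tolerance=0.5):
    """Nearest-time alignment of two nodes' records.

    rows_a, rows_b: lists of (timestamp, data) sorted by timestamp.
    Records within `tolerance` seconds (an assumed constant) are treated
    as data of the same time and stored in the same record (line)."""
    if not rows_b:
        return []
    combined, j = [], 0
    for t_a, data_a in rows_a:
        # Advance j while the next record of B is at least as close to t_a.
        while j + 1 < len(rows_b) and \
                abs(rows_b[j + 1][0] - t_a) <= abs(rows_b[j][0] - t_a):
            j += 1
        t_b, data_b = rows_b[j]
        if abs(t_b - t_a) <= tolerance:
            # The stored time is adjusted to the average of the nearest times.
            combined.append(((t_a + t_b) / 2, data_a, data_b))
    return combined
```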
- a combined table (CTBab) of FIG. 1A shows an example of a simplified combination of a data table A (DTBa) and a data table B (DTBb).
- the combined table (CTBab) includes the acceleration, infrared, and voice data.
- it is also possible to generate a combined table for each type of data: for example, a combined table including only the acceleration data, or a combined table including only the voice data.
- the content of the combined table is used as combined table data BMCD 1 and BMCD 2 for the organizational activity analysis (BME) and the correlation coefficient study (BMD) shown in FIGS. 1B and 1C .
- the correlation coefficient study (BMD) is performed in order to calculate the relationship from the organizational dynamics data and estimate the performance ( FIG. 1B ).
- the correlation coefficient is calculated using the data of a certain period in the past. This process may be more effective if the correlation coefficient is periodically recalculated and updated using new data.
- the correlation coefficient can also be calculated using the time series data, such as the voice data, in place of the acceleration data by the same procedure as described below.
- the correlation coefficient study (BMD) is performed by an application server (AS) (see FIG. 2B ), which will be described below.
- the correlation coefficient study (BMD) may actually be performed by a device other than the application server (AS).
- the application server (AS) sets a width T of data used for the calculation of the correlation coefficient, in the range from several days to several weeks.
- the application server (AS) selects the data in this period.
- the acceleration frequency calculation (BMDA) is a process for obtaining a frequency from the acceleration data aligned in chronological order.
- the frequency is defined here as the number of wave cycles per second; in other words, it is an index of the intensity of the oscillation.
- an accurate calculation of the frequency requires a Fourier transform, which increases the amount of calculation.
- this embodiment uses a zero-cross value serving as the frequency in order to simplify the calculation.
- the zero-cross value is a count of the number of times the value of the time series data crosses zero in a certain period. More specifically, it is a count of the number of times the time series data changes from a positive value to a negative value, or vice versa. For example, if one cycle is defined as the period during which the acceleration value changes from positive to negative and back from negative to positive, the frequency per second can be calculated from the counted number of zero-crossings.
- the frequency per second calculated in this way can be used as the approximate frequency of the acceleration.
- the name-tag type sensor node (NN) of this embodiment is equipped with a triaxial acceleration sensor.
- one zero-cross value is calculated by summing the zero-cross values in three axis directions in the same period.
- in this way, a pendulum-like swinging motion is detected, in particular in the left-and-right and back-and-forth directions, and can be used as an index of the intensity of the oscillation.
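A minimal sketch of this zero-cross calculation follows. The function names are illustrative, and the conversion to a per-second frequency assumes, as in the text, that one cycle corresponds to two zero-crossings (positive to negative and back again).

```python
def zero_cross_count(samples):
    # Number of times the time series changes from a positive value to a
    # negative value or vice versa.
    return sum(1 for prev, cur in zip(samples, samples[1:]) if prev * cur < 0)

def approximate_frequency(ax, ay, az, period_seconds):
    """Approximate acceleration frequency over one counting period.

    The zero-cross values in the three axis directions are summed over the
    same period; dividing by two (one cycle = two zero-crossings) and by the
    period length yields an approximate frequency in Hz."""
    total = zero_cross_count(ax) + zero_cross_count(ay) + zero_cross_count(az)
    return total / 2.0 / period_seconds
```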
- as the period over which zero-crossings are counted, a value larger than the interval of the continuous data (namely, the original sensing interval) is set, on the order of seconds or minutes.
- the application server sets a window width w, which is a time width larger than the period used for the zero-cross value and smaller than the entire data width T.
- the frequency distribution and fluctuation are obtained in this window.
- the frequency distribution and fluctuation are calculated for each window by moving the windows sequentially along the time axis.
- when the window is moved each time by its full width w, so that consecutive windows do not overlap, the feature amount graph used in the subsequent correlation coefficient calculation is a discrete graph.
- when the window is moved by a width smaller than w, so that consecutive windows overlap, the feature amount graph used for the subsequent correlation coefficient calculation is a continuous graph.
- the width by which the window is moved may be set to an arbitrary value with this fact in mind.
- the zero-cross value is also referred to as frequency.
- the term “frequency” is the concept including the zero-cross value.
- the term “frequency” may be applied to the accurate frequency calculated by Fourier transform, or to the approximate frequency calculated from the zero-cross value.
- the application server performs an individual feature amount extraction (BMDB).
- BMDB is a process for extracting the feature amount of an individual by calculating the frequency distribution and frequency fluctuation of the acceleration in each window.
- the application server obtains the frequency distribution (namely, the intensity) (DB12).
- the frequency distribution is the rate of occurrence of each acceleration frequency.
- the frequency distribution of the acceleration reflects what a person wearing the name-tag type sensor node (NN) does and how long it takes. For example, the frequency of the acceleration is different between when the person is walking and when the person is writing e-mail on a PC. In order to record a histogram of such an acceleration history, the occurrence frequency of the acceleration is obtained for each frequency.
- the application server (AS) determines the frequency assumed (or desired) to be the maximum. The application server (AS) then divides the range from 0 to the determined maximum value into 32 segments, and counts the number of acceleration data samples falling in each of the divided frequency ranges. In this way, the occurrence frequency of the acceleration is calculated for each frequency and treated as the feature amount. The same process is performed for each window.
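A sketch of this histogram-style feature, under the stated division into 32 segments between 0 and the assumed maximum frequency (function and parameter names are illustrative):

```python
def frequency_distribution(window_freqs, max_freq, segments=32):
    """Occurrence count of the acceleration frequency in each of `segments`
    equal ranges between 0 and `max_freq` (the frequency assumed, or desired,
    to be the maximum). Returns the 32 feature amounts for one window."""
    counts = [0] * segments
    width = max_freq / segments                 # assumes max_freq > 0
    for f in window_freqs:
        i = min(int(f / width), segments - 1)   # clamp values at max_freq
        counts[i] += 1
    return counts
```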
- the application server calculates the “fluctuation for each frequency” (DB11), in addition to the frequency distribution of the acceleration.
- the fluctuation of frequency is the value indicating how long the frequency of the acceleration is continuously maintained.
- the fluctuation for each frequency is an index of how long the activity of a person is maintained. For example, suppose a person walks for 30 minutes in an hour. The meaning of the activity is different between the case where the person repeats a one-minute walk and a one-minute stop, and the case where the person walks continuously for 30 minutes after a 30-minute break. These activities can be distinguished by calculating the fluctuation for each frequency.
- the magnitude of the fluctuation changes largely depending on the setting of the criterion, namely, the range of difference between two consecutive values within which the value of the acceleration frequency is determined to be maintained. Further, information about the dynamics of the data, such as whether the value changes a little or a lot, could be lost. For this reason, in this embodiment, the entire range of the acceleration frequency is divided into a predetermined division number.
- the entire range of the frequency corresponds to the range from the frequency “0” to the maximum value of the frequency (see Step DB 12 ).
- the divided segments are used as the criteria for determination whether the value is maintained or not. For example, when the division number is 32, the entire range of the frequency is divided into 32 segments.
- when the acceleration frequency at a certain time t is in the ith segment, and the acceleration frequency at the next time t+1 is in the (i−1)th, ith, or (i+1)th segment, it is determined that the value of the acceleration frequency is maintained.
- when the acceleration frequency at the time t+1 is not in the (i−1)th, ith, or (i+1)th segment, it is determined that the value of the acceleration frequency is not maintained.
- the number of times of determining that the value is maintained is counted as the feature amount of the fluctuation. The above process is performed for each window.
- further, the feature amounts of the fluctuation are calculated with the division numbers 16, 8, and 4 in the same way (together with the division number 32, this yields 32+16+8+4 = 60 fluctuation feature amounts). In this way, by changing the division number in the calculation of the fluctuation for each frequency, both small and large changes can be reflected in some of the feature amounts.
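The following sketch combines the “maintained” test with the varying division numbers. Consistently with the 60 fluctuation feature amounts mentioned below, it assumes the maintained count is kept per segment; the names are illustrative.

```python
def fluctuation_features(window_freqs, max_freq, divisions=(32, 16, 8, 4)):
    """'Fluctuation for each frequency' of one window.

    For every division number n, the range 0..max_freq is divided into n
    segments; a transition from segment i at time t to segment i-1, i, or i+1
    at time t+1 counts as 'maintained' for segment i.
    Returns 32 + 16 + 8 + 4 = 60 feature amounts."""
    features = []
    for n in divisions:
        width = max_freq / n
        maintained = [0] * n
        for prev, cur in zip(window_freqs, window_freqs[1:]):
            i = min(int(prev / width), n - 1)
            j = min(int(cur / width), n - 1)
            if abs(i - j) <= 1:
                maintained[i] += 1
        features.extend(maintained)
    return features
```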
- the above description has focused on the example of calculating the frequency distribution and fluctuation of the acceleration.
- the application server (AS) can perform the same process as described above for other data (for example, the voice data) than the acceleration data. As a result, the feature amount is calculated based on the obtained data.
- the application server (AS) calculates 32 patterns of the frequency distribution and 60 patterns of the fluctuation magnitude for each frequency, or 92 values in total.
- the application server (AS) treats the values as the feature amounts of the person A in the windows in the time frame (DB 13 ).
- the 92 feature amounts (x_A1 to x_A92) are all independent.
- the application server calculates the above feature amounts based on the data transmitted from the name-tag type sensor nodes (NN) of all the members belonging to the organization (or all the members desired to be analyzed). Since the feature amounts are calculated for each window, the feature amounts for one member can be treated as a time series data by plotting the feature amounts in order of the time of the window.
- the time of the window can be determined according to an arbitrary rule. For example, the time of the window may be the time at the center of the window, or may be the time at the beginning of the window.
- the feature amounts (x_A1 to x_A92) are the feature amounts for the person A, which are calculated based on the acceleration detected by the name-tag type sensor node (NN) worn by the person A.
- feature amounts (for example, x_B1 to x_B92) for another person (for example, the person B) are calculated based on the acceleration detected by the name-tag type sensor node (NN) worn by the person B.
- the application server performs an intercorrelation calculation (BMDC).
- the intercorrelation calculation (BMDC) is a process for obtaining the intercorrelation of the feature amounts for two persons. Suppose the two persons are person A and person B.
- a feature amount x_A shown in the intercorrelation calculation (BMDC) of FIG. 1B is a graph plotting the time-series change of the feature amounts of the person A.
- a feature amount x_B shown in the intercorrelation calculation (BMDC) is the corresponding graph of the feature amounts of the person B.
- the influence of a feature amount (for example, x_A1) of the person A on a feature amount (for example, x_B1) of the person B is expressed as a cross-correlation function of the time lag τ, of the form

  C(τ) = (1/T) Σ_{t=0..T} ( x_A1(t) − x̄_A1 ) · ( x_B1(t+τ) − x̄_B1 )

  where x_A1(t) is the value of the feature amount x_1 of the person A at the time t, x̄_A1 is the average value of the feature amount x_1 of the person A in the time range 0 to T, and T is the time width in which the data of the frequency exists.
- when the cross-correlation has a peak at a lag τ1, the activity of the person B at a certain time is likely to be similar to the activity that the person A performed τ1 earlier.
- in other words, the feature amount x_B1 of the person B is influenced after the time τ1 has passed from the occurrence of the activity of the feature amount x_A1 in the person A.
- the value τ at which the peak appears represents the type of the influence. For example, it could be said that a τ of less than a few seconds indicates the influence of a face-to-face communication such as nodding, whereas a τ of several minutes to several hours indicates the influence of an activity.
- the application server (AS) applies the procedure of the intercorrelation calculation to the combinations of the 92 feature amounts of the person A and the 92 feature amounts of the person B. Further, the application server (AS) calculates the feature amounts for each pair of the members belonging to the organization (or the members desired to be analyzed) by the above-described procedure.
- the application server obtains plural feature amounts of the organization from the results of the intercorrelation calculations relating to the feature amounts. For example, the time domain is divided into several periods, such as within an hour, within a day, and within a week, and the resulting values for each pair of persons are treated as the feature amounts (BMDD); the constants determined from the results of the intercorrelation calculation are used as the feature amounts. A method other than the one described above may also be used. In this way, one organizational feature amount is obtained from one intercorrelation calculation. When the number of individual feature amounts is 92, the square of 92 for each pair, namely 8464 organizational feature amounts, can be obtained.
- the intercorrelation reflects the influence and relationship of the two members belonging to the organization. For this reason, by using the values obtained by the intercorrelation calculations as the feature amounts of the organization, it is possible to treat the organization, which is realized through human relationship, in a quantitative manner.
- the application server obtains the data of quantitative evaluation of the organization (hereinafter referred to as performances) from the performance database (PDB) as PDBD shown in FIG. 1A (BMDE).
- performances may be calculated, for example, from the achievements of an individual that each person declared, or the results of subjective evaluation relating to human relationships and the like in the organization. It is also possible to use the financial evaluation of the organization, such as sales and loss, as the performances.
- the performances are obtained from the performance database (PDB) of the organizational dynamics data collection (BMB), and processed together with the time information at which the performances were evaluated.
- examples of the performances of the organization are six indexes (p1 to p6): sales, customer satisfaction, cost, error rate, growth, and flexibility.
- the application server (AS) performs correlation analysis between the organizational feature amounts and each of the organizational performances (BMDF).
- the organizational feature amounts are enormous in number and include unnecessary feature amounts.
- the application server (AS) selects only effective feature amounts by a stepwise method (BMDG).
- the application server (AS) may also use a method other than the stepwise method for the selection of the feature amounts.
- in the simplest case, linear modeling is used: each performance p is expressed as a linear combination of the selected organizational feature amounts, p = a_1 X_1 + a_2 X_2 + … + a_m X_m, where m is 92 in the example of FIG. 1B. This is performed for p_1 to p_6 to determine the coefficient sets A_1 to A_6 for p_1 to p_6, respectively.
- values X_1, X_2, and so on determined by a non-linear model may be adopted in order to increase the accuracy. It is also possible to use a method such as a neural network.
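As an illustration of the linear modeling, here is a least-squares sketch using NumPy. It is not the patent's exact procedure (which first selects effective features stepwise); it only shows how a coefficient set A could be fitted and then used to estimate one performance index.

```python
import numpy as np

def fit_linear_model(X, p):
    """Least-squares fit of one performance index p as a linear combination of
    the selected organizational feature amounts X (rows = time windows,
    columns = feature amounts). Returns the coefficients plus an intercept."""
    A, *_ = np.linalg.lstsq(np.c_[X, np.ones(len(X))], p, rcond=None)
    return A

def estimate_performance(A, x_row):
    # p ≈ a1*X1 + a2*X2 + ... + am*Xm + intercept
    return float(np.dot(A[:-1], x_row) + A[-1])
```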
- the organizational activity analysis (BME) shown in FIG. 1C is a process for obtaining a relationship between arbitrary two persons in the combined table, from the data such as the acceleration, voice, and face-to-face communication, and then calculating the performances of the organization.
- the acceleration frequency calculation (EA12), individual feature amount extraction (EA13), intercorrelation calculation between persons (EA14), and organizational feature amount calculation (EA15) are the same processes as the acceleration frequency calculation (BMDA), individual feature amount extraction (BMDB), intercorrelation calculation (BMDC), and organizational feature amount calculation (BMDD) described above.
- the organizational feature amounts obtained in this way are substituted into the model determined in the correlation coefficient study (BMD) to calculate a value. This value is an estimation of the organizational performance (EA17).
- the latest values of the six indexes of the organizational performances are displayed as a balance (the index balance display). Further, the history of the value of one index is displayed as an index estimation history in a time-series graph.
- the distance between arbitrary persons, which is obtained from the value of the intercorrelation between the persons (EK41), is used for determining a parameter (organizational structure parameter) to display the organizational structure.
- the distance between the persons is an index of the relationship between the persons, and not the geographic distance. For example, the stronger the relationship between the persons is (for example, the stronger the intercorrelation between the persons), the shorter the distance therebetween is.
- grouping is performed based on the distance between the persons (EK42) to determine the groups in the display.
- the grouping is a process for generating pairs of persons having close relationships with each other. That is, a pair of at least two persons A and B particularly having a close relationship is defined as one group. A pair of at least two persons C and D having another close relationship is defined as one group. Then, a group of these persons A, B, C, and D is defined as a large group.
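The grouping could be sketched, for example, with a union-find pass over the pairwise distances; this particular algorithm is an assumption for illustration, not something the text specifies.

```python
def group_by_distance(dist, threshold):
    """Union-find grouping: persons whose pairwise distance (an index of the
    strength of their relationship, not a geographic distance) is below
    `threshold` end up in the same group; chains of close pairs merge into
    one large group."""
    parent = {p: p for pair in dist for p in pair}

    def find(p):
        while parent[p] != p:
            parent[p] = parent[parent[p]]  # path halving
            p = parent[p]
        return p

    for (a, b), d in dist.items():
        if d < threshold:
            parent[find(a)] = find(b)

    groups = {}
    for p in parent:
        groups.setdefault(find(p), []).append(p)
    return list(groups.values())

# Example: close pairs (A,B) and (C,D); if B and C are also close,
# A, B, C, and D merge into one large group:
# group_by_distance({("A","B"): 0.1, ("C","D"): 0.2, ("B","C"): 0.3}, 0.5)
```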
- the infrared data includes information about who meets who and when it occurs.
- the application server (AS) analyzes the face-to-face communication history by the infrared data (EI 22 ). Then the application server (AS) defines parameters for displaying the organizational structure based on the face-to-face communication history (EK 43 ). At this time, it is also possible that the application server (AS) calculates the distance between arbitrary persons from the face-to-face communication history, and defines a parameter based on the calculated distance. For example, the distance between the persons is calculated to be shorter (namely, their relationship is stronger) as the number of their face-to-face communications is increased in a predetermined period.
- the application server may determine the parameters, in such a way that the total number of face-to-face communications in one person is reflected in the size of the node, the number of short-term face-to-face communications among persons is reflected in the distance among nodes, and the number of long-term face-to-face communications between arbitrary persons is reflected in the thickness of the link.
- the node is an image displayed to indicate each person on a display (CLOD) of the client (CL).
- the link is a line connecting two nodes.
- the displayed node is larger as the relevant person has communicated face-to-face with many persons until now, regardless of who they are.
- the two nodes are positioned closer to each other as the relevant two persons have frequently communicated face-to-face in recent days.
- the two nodes are connected by a thicker link as the relevant two persons have communicated face-to-face for a long time.
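These three display rules can be summarized in a small sketch; the scaling constants are purely illustrative assumptions.

```python
def display_parameters(total_meetings, recent_pair_meetings, longterm_pair_meetings):
    """Map the face-to-face history onto display parameters:
    - node size grows with a person's total number of face-to-face communications,
    - node distance shrinks as a pair meets more often in recent days,
    - link thickness grows with a pair's long-term face-to-face count."""
    node_size = {p: 10 + 2 * n for p, n in total_meetings.items()}
    node_distance = {pair: 100.0 / (1 + n) for pair, n in recent_pair_meetings.items()}
    link_thickness = {pair: 1 + 0.5 * n for pair, n in longterm_pair_meetings.items()}
    return node_size, node_distance, link_thickness
```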
- the application server can reflect the attribute information of a user wearing the name-tag type sensor node, in the display of the organizational structure.
- the color of the node indicating a person may be determined by the age of the person.
- the shape of the node may be determined according to the title of the position.
- the intercorrelation between persons can be calculated using the voice data in place of the acceleration data.
- a conversation feature amount (EV33) is obtained by extracting a voice feature amount from the voice data (EV32) and by analyzing the extracted feature amount together with the face-to-face communication data.
- the conversation feature amount is an amount indicating, for example, tone of voice in conversation, rhythm of dialogue, or conversation balance.
- the conversation balance is an amount indicating whether one of two persons monopolizes the conversation or the two persons share the conversation.
- the conversation balance is extracted based on the voices of the two persons.
- the application server may determine a parameter of the display so that the conversation balance is reflected in the angle between the nodes. More specifically, for example, when two persons share the conversation, the nodes indicating the persons may be displayed in parallel. When one of the two persons monopolizes the conversation, the node indicating the talking person may be displayed above the node of the other person. It is also possible to display so that the stronger the tendency of one person to monopolize the conversation, the larger the angle between a line connecting the nodes indicating the two persons and a reference line (namely, an angle θAB or θCD in the example of the organizational structure display (FC31) in FIG. 1C).
- the reference line is, for example, a line provided in the lateral direction of the display (namely, in the horizontal direction). The reference line may not be displayed on the display.
- the organizational activity display is a process for generating an index balance display (FA 11 ), index estimation history (FB 21 ), and organizational structure display (FC 31 ) and the like, from the organizational performance estimation and organizational structure parameters calculated by the above described processes.
- the generated data is displayed on the display (CLOD) of the client (CL), or other display means.
- An organizational activity (FD 41 ) of FIG. 1C is an example of an image displayed on the display (CLOD) of the client (CL).
- first, the selected display period and the unit (or plural members) desired to be displayed are shown.
- the unit is an organization having plural members. There may be displayed all the members belonging to one unit, or plural members who are a part of the unit.
- the results of the analysis based on the conditions such as the display period and the unit are displayed as three different images.
- the image of the index estimation history (FB21) is an example of an estimation result history for the performance “growth”. This makes it possible to analyze, by comparison with the past activity history, what activity of the members benefits the organization, and what is effective for turning a negative trend into a positive one.
- the organizational structure display (FC 31 ) visualizes the states of the small groups that constitute the organization, the role that each member actually has in the organization, the balance between arbitrary members, and the like.
- the index balance display (FA 11 ) shows the balance of the estimation of the specified six organizational performances. This makes it possible to figure out the strengths and weaknesses of the actual organization.
- FIGS. 2A and 2B are block diagrams showing the overall configuration of a sensor network system that realizes the business microscope system.
- the five arrows of different shapes in FIGS. 2A and 2B indicate the flows of time synchronization, association, storage of obtained sensing data, data flow for data analysis, and control signals.
- the business microscope system includes a name-tag type sensor node (NN), a base station (GW), a sensor network server (SS), an application server (AS), and a client (CL).
- Each of the functions is realized by hardware, software, or the combination thereof.
- the function block is not necessarily associated with a hardware entity.
- FIG. 2A shows the configuration of the name-tag type sensor node (NN) which is an example of a sensor node.
- the name-tag type sensor node (NN) is equipped with various sensors: plural infrared transceivers (TRIR1 to TRIR4) for detecting the face-to-face communication state; a triaxial acceleration sensor (ACC) for detecting the activity of the wearer; a microphone (MIC) for detecting the speech of the wearer as well as the sound around the wearer; luminance sensors (LS1F, LS1B) for detecting the front and back of the name-tag type sensor node; and a temperature sensor (THM).
- the name-tag type sensor node (NN) includes four infrared transceivers.
- the infrared transceivers (TRIR 1 to TRIR 4 ) continue to periodically transmit terminal information (TRMD) which is the unique identification information of the name-tag type sensor node (NN), in the front direction.
- in this way, a name-tag type sensor node (NNm) and the other facing name-tag type sensor node exchange their terminal information (TRMD) by infrared radiation.
- Each infrared transceiver is generally formed by a combination of an infrared emitting diode for infrared transmission, and an infrared phototransistor.
- an infrared ID transmitter (IrID) generates the node's own ID (the terminal information TRMD) and transfers it to the infrared emitting diode of each infrared transceiver module.
- the same data is transmitted to plural infrared transceiver modules, and then all the infrared emitting diodes are lighted at the same time.
- the data received by the infrared phototransistors of the infrared transceivers TRIR 1 to TRIR 4 is ORed by a logical sum circuit (IROR).
- when the ID is received by at least one of the infrared receivers, it is recognized by the name-tag type sensor node as a received ID.
- alternatively, the transmission and reception state can be kept separate for each of the infrared transceiver modules. In that case, it is possible to obtain additional information, for example about the direction in which the other facing name-tag type sensor node is located.
- Physical quantity data SENSD detected by the sensors is stored in a memory unit STRG by a sensor data storage controller (SDCNT).
- the physical quantity data is converted to a transmission packet by a wireless communication controller TRCC, and is transmitted to the base station GW by a transceiver unit TRSR.
- a communication timing controller TRTMG generates a timing for extracting the physical quantity data SENSD from the memory unit STRG and wirelessly transmitting the data.
- the communication timing controller TRTMG has plural time bases to generate plural timings.
- Examples of the data stored in the memory unit are a past-accumulated physical quantity CMBD, and data FMUD for updating the firmware which is an operation program of the name-tag type sensor node, in addition to the physical quantity data SENSD currently detected by the sensors.
- the name-tag type sensor node (NN) of this example detects the connection of an external power supply (EPOW) by an external power detection circuit (PDET), and generates an external power detection signal (PDETS).
- the external power detection signal (PDETS) is used in a time base selector (TMGSEL) to switch the transmission timing generated by the timing controller (TRTMG).
- FIG. 2A shows an example of the configuration in which the time base selector TMGSEL switches the transmission timing between a time base 1 (TB1) and a time base 2 (TB2) according to the external power detection signal PDETS, and in which the data selector TRDSEL switches the data to be communicated among the physical quantity data SENSD obtained by the sensors, the past-accumulated physical quantity data CMBD, and the firmware update data FIRMUPD, according to the external power detection signal PDETS.
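In pseudocode form, this selector logic might look as follows; this is a sketch only, and the interval values and the storage layout (the strg dictionary and its keys) are assumptions.

```python
def select_transmission(pdets, strg):
    """Mode switch driven by the external power detection signal (PDETS).

    On battery (pdets False): time base 1, transmitting only the currently
    sensed physical quantity data (SENSD) at the normal, power-saving rate.
    On external power (pdets True): time base 2 with a higher communication
    frequency, additionally transferring the bulky or reliability-critical
    data: past-accumulated physical quantities (CMBD) and firmware update
    data (FIRMUPD)."""
    if pdets:
        interval = strg["TB2_INTERVAL"]          # shorter interval: more bandwidth
        payload = ["SENSD", "CMBD", "FIRMUPD"]   # large / reliability-critical data
    else:
        interval = strg["TB1_INTERVAL"]          # normal battery operation
        payload = ["SENSD"]
    return interval, payload
```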
- the luminance sensors (LS 1 F, LS 1 B) are mounted on the front and back of the name-tag type sensor node (NN), respectively.
- the data obtained by the luminance sensors LS 1 F and LS 1 B is stored in the memory unit STRG by a sensor data storage controller SDCNT.
- the obtained data is compared by a reverse detection unit (FBDET).
- FBDET reverse detection unit
- when the name-tag type sensor node is worn correctly, the front luminance sensor LS1F receives incoming light, while the back luminance sensor LS1B, positioned between the main body of the name-tag type sensor node and the wearer, receives no incoming light. At this time, the luminance detected by LS1F is larger than the luminance detected by LS1B.
- when the name-tag type sensor node is reversed, LS1B receives incoming light. At this time, the luminance detected by LS1B is larger than the luminance detected by LS1F, because LS1F now faces the wearer.
- the luminance detected by LS 1 F and the luminance detected by LS 1 B are compared by the reverse detection unit FBDET in order to detect that the name-tag type sensor node is reversed and incorrectly worn.
- when a reverse is detected by FBDET, a speaker SP generates a warning sound to notify the wearer.
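A minimal sketch of this comparison; the noise margin is an assumed constant, not taken from the text.

```python
def reverse_detected(lum_front, lum_back, margin=1.2):
    """Compare the front (LS1F) and back (LS1B) luminance sensors.

    Worn correctly, the front sensor receives incoming light while the back
    sensor, pressed against the wearer, does not, so lum_front > lum_back.
    If the back reading clearly exceeds the front one, the name tag is
    reversed; `margin` guards against noise (assumed value)."""
    return lum_back > margin * lum_front

# if reverse_detected(ls1f, ls1b): speaker.warn()  # warning sound to the wearer
```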
- the microphone obtains voice information. From the voice information it is possible to know, for example, that the surrounding environment is “noisy” or “quiet”. Further, by obtaining and analyzing the voices of persons, it is possible to analyze the face-to-face communication: whether the communication is active or inactive, whether the conversation is monopolized or shared, and whether the speakers are angry or smiling. Further, even if the infrared transceivers TRIR do not detect the face-to-face communication state, due to the standing positions of the persons or other reasons, this can be compensated for by the voice information and the acceleration information.
- the voice obtained from the microphone MIC is provided as a voice waveform and as a signal integrated by an integrating circuit AVG.
- the integrated signal represents the energy of the obtained voice.
- the triaxial acceleration sensor (ACC) detects the acceleration of the node, namely, the movement of the node.
- the data obtained by the triaxial acceleration sensor ACC is stored in the memory unit STRG by the sensor data storage controller SDCNT.
- the orientation of the name tag is detected by an up/down detection circuit UDDET. This uses the fact that the triaxial acceleration sensor detects two types of acceleration: the dynamic acceleration change caused by the wearer's movement, and the static acceleration caused by the Earth's gravity.
- the display device LCDD displays personal information, such as the department and name of the wearer, when the name-tag type sensor node is worn on the chest.
- the name-tag type sensor node acts as a name tag.
- when the wearer holds the name-tag type sensor node in the hand and turns the display device LCDD toward himself or herself, the name-tag type sensor node is upside down.
- the content displayed on the display device LCDD, as well as the button functions, is switched by an up/down detection signal UDDETS generated by the up/down detection circuit UDDET.
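A sketch of the up/down decision from the static gravity component; the axis convention, filtering, and threshold are all assumptions for illustration.

```python
def upside_down(acc_y, threshold=0.5):
    """Static gravity component along the name tag's vertical axis tells its
    orientation: roughly -1 g when hanging normally on the chest, and +1 g
    when the wearer holds it flipped toward themselves. The dynamic component
    from body movement is assumed to average out (e.g. via a low-pass filter).
    Axis sign and threshold are illustrative assumptions."""
    return acc_y > threshold  # UDDETS: switch display content and button functions

# content = analysis_view if upside_down(filtered_acc_y) else name_tag_view
```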
- the information displayed on the display device LCDD is switched between the analysis results of the infrared activity analysis (ANA) generated by a display controller DISP, and the name tag display DNM, based on the value of the up/down detection signal UDDETS.
- the infrared radiation is exchanged between the nodes by their transceivers (TRIR), in order to detect whether the name-tag type sensor node is facing the other name-tag type sensor node, namely, whether a person wearing the name-tag type sensor node is facing a person wearing the other name-tag type sensor node.
- the name-tag type sensor nodes are worn on the front of the persons.
- the name-tag type sensor node further includes the sensor such as the acceleration sensor (ACC).
- the process of sensing in the name-tag type sensor node corresponds to the organizational dynamics data acquisition (BMA) in FIG. 1A .
- the temperature sensor (THM) of the name-tag type sensor node (NN) obtains the temperature at the location of the name-tag type sensor node.
- the luminance sensor (LS 1 F) obtains luminance in the front direction or other direction of the name-tag type sensor node (NN). In this way, it is possible to record the surrounding environment. For example, the movement of the name-tag type sensor node (NN) from a certain place to another place can be found based on the temperature and the luminance.
- the name-tag type sensor node includes buttons 1 to 3 (BTN 1 to BTN 3 ), display device (LCDD), speaker (SP) and the like, as an input/output device for the wearer.
- the memory unit (STRG) includes, in particular, a nonvolatile memory device such as a hard disc or a flash memory.
- the memory unit (STRG) stores terminal information (TRMT) which is the unique identification number of the name-tag type sensor node (NN), interval of sensing, and operation setting (TRMA) such as the content output to the display.
- the memory unit (STRG) can temporarily store data, and is used for storing the sensing data.
- the communication timing controller is a clock for maintaining time information and updating the time information at a predetermined interval.
- the time information is periodically corrected by time information GWCSD transmitted from the base station (GW), in order to prevent the time lag between the time information and that of the other name-tag type sensor node.
- the sensor data storage controller manages the obtained data by controlling the sensing interval and the like of the sensors according to the operation setting (TRMA) stored in the memory unit (STRG).
- the time synchronization obtains the time information from the base station (GW) and corrects the clock.
- the time synchronization may be performed immediately after an associate operation described below, or may be performed in response to a time synchronization command transmitted from the base station (GW).
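- as an illustration of this correction, the following minimal sketch (in Python; the patent specifies no code, and all names here are invented) keeps the node time as a local timer plus an offset that is nudged by the time information received from the base station (GW):

    import time

    class NodeClock:
        """Illustrative node clock: local monotonic timer plus an offset."""
        def __init__(self):
            self.offset = 0.0  # seconds added to the local timer

        def now(self):
            return time.monotonic() + self.offset

        def correct(self, base_station_time):
            # base_station_time plays the role of the GWCSD time information
            self.offset += base_station_time - self.now()

    clock = NodeClock()
    clock.correct(1_000_000.0)  # e.g. run right after an associate operation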
- the wireless communication controller (TRCC) is involved in the data transmission and reception, controlling the transmission interval and converting the data into a data format appropriate for wireless transmission and reception.
- the wireless communication controller may include a wired communication function instead of wireless, if necessary.
- the wireless communication controller sometimes performs congestion control so that the transmission timing does not overlap with the transmission timing of the other name-tag type sensor node (NN).
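- the patent does not spell out the congestion control algorithm; one plausible scheme, sketched below under that assumption, randomizes the transmission slot within each interval so that nodes sharing a base station rarely collide:

    import random

    def next_send_time(now_s, interval_s, num_slots=16):
        """Pick a random slot in the coming interval to stagger transmissions."""
        slot = random.randrange(num_slots)
        return now_s + interval_s + slot * (interval_s / num_slots)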
- the associate (TRTA) transmits a request TRTAQ to form a personal area network (PAN) with the base station (GW) shown in FIG. 2B , and receives a response TRTAR to the request.
- the base station (GW) to which the data should be transmitted is determined.
- the associate (TRTA) is performed when the power of the name-tag type sensor node (NN) is turned on, and when the transmission/reception with the base station (GW) is stopped due to movement of the name-tag type sensor node (NN).
- the name-tag type sensor node (NN) is associated with a base station (GW) located nearby, within reach of its wireless signal.
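- a minimal sketch of the associate exchange follows; only the TRTAQ/TRTAR message names and the terminal information (TRMT) come from the text, while the objects and methods are hypothetical stand-ins:

    def associate(node, reachable_base_stations):
        """Form a PAN with the first base station (GW) that answers TRTAQ."""
        request = {"type": "TRTAQ", "terminal": node["TRMT"]}
        for gw in reachable_base_stations:
            reply = gw.handle_associate(request)  # hypothetical GW method
            if reply and reply.get("type") == "TRTAR":
                node["local_id"] = reply["local_id"]
                return gw  # all sensing data now goes to this GW
        return None  # retried at power-on or when the link to the GW is lost

    class StubGW:
        """Stand-in base station that always grants association."""
        def handle_associate(self, request):
            return {"type": "TRTAR", "local_id": 1}

    chosen_gw = associate({"TRMT": "NN-0001"}, [StubGW()])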
- the transceiver unit includes an antenna to transmit and receive wireless signals.
- the transceiver unit (TRSR) can also transmit and receive using a connector for wired communication, if necessary.
- the data TRSRD is transmitted and received between the transceiver unit TRSR and the base station (GW) through the personal area network (PAN).
- the base station (GW) shown in FIG. 2B has a function of intermediating between the name-tag type sensor node (NN) and the sensor network server (SS). By considering the wireless range, plural base stations (GW) are provided so as to cover areas such as living room and work place.
- the base station includes a transceiver unit (BASR), a memory unit (GWME), a clock (GWCK), and a controller (GWCO).
- the transceiver unit (BASR) receives wireless signals from the name-tag type sensor node (NN), and performs wired or wireless transmission to the sensor network server (SS). Further, the transceiver unit (BASR) includes an antenna for receiving wireless signals.
- the memory unit (GWME) includes a hard disc, and a nonvolatile memory device such as a flash memory.
- the memory unit (GWME) stores at least operation setting (GWMA), data format information (GWMF), terminal management table (GWTT), and base station information (GWMG).
- the operation setting (GWMA) includes the information about the operation method of the base station (GW).
- the data format information (GWMF) includes the information about the data format for data communication as well as the information about the tag to be added to the sensor data.
- the terminal management table (GWTT) includes the terminal information (TRMT) of the children name-tag type sensor nodes (NN) in which the association is actually established, and the local IDs provided to manage the name-tag type sensor nodes (NN).
- the base station information (GWMG) includes the information such as the address of the own base station (GW). Further, the memory unit (GWME) temporarily stores the updated firmware (GWTF) of the name-tag type sensor node.
- the memory unit (GWME) may also store programs to be executed by a central processing unit CPU (not shown) within the controller (GWCO).
- the clock (GWCK) maintains time information.
- the time information is updated at a predetermined interval. More specifically, the time information of the clock (GWCK) is corrected by the time information obtained from an NTP (Network Time Protocol) server at a predetermined interval.
- the controller includes the CPU (not shown).
- the CPU executes the programs stored in the memory unit (GWME) to manage the acquisition timing of sensing data sensor information, the processing of the sensor data, the transmission/reception timing to the name-tag type sensor node (NN) as well as the sensor network server (SS), and the time synchronization timing. More specifically, the CPU executes the programs stored in the memory unit (GWME) to perform the processes of wireless communication control/communication control (GWCC), data format conversion (GWDF), associate (GWTA), and time synchronization management (GWCD).
- the wireless communication control/communication control controls the timing of wireless or wired communication with the name-tag type sensor node (NN) and the sensor network server (SS). Further, the wireless communication control/communication control (GWCC) classifies the type of received data. More specifically, the wireless communication control/communication control (GWCC) identifies whether the received data is general sensing data, data for an association operation, or a response of the time synchronization or others, from the header portion of the data. Then the wireless communication control/communication control (GWCC) passes these data pieces to the appropriate functions, respectively.
- the wireless communication control/communication control performs the data format conversion (GWDF). More specifically, the wireless communication control/communication control (GWCC) refers to the data format information (GWMF) stored in the memory unit (GWME), converts the data to an appropriate format for transmission/reception, and adds the tag information for indicating the type of the data.
- the associate (GWTA) transmits the response TRTAR to the associate request TRTAQ transmitted from the name-tag type sensor node (NN), and transmits the local ID assigned to the name-tag type sensor node (NN). Once an association is established, the associate (GWTA) performs a terminal management information correction (GWTF) to correct the terminal management table (GWTT).
- the time synchronization management controls the interval and timing at which the time synchronization is performed, and issues an instruction to perform the time synchronization. It is also possible that the sensor network server (SS) performs the time synchronization management (GWCD) and transmits the instruction to all the base stations (GW) in the system, which will be described below.
- the time synchronization (GWCS) is connected to the NTP server (TS) on the network to request and obtain time information.
- the time synchronization (GWCS) corrects the clock (GWCK) based on the obtained time information.
- the time synchronization (GWCS) transmits the time synchronization instruction and the time information (GWCSD), to the name-tag type sensor node (NN).
- the sensor network server (SS) of FIG. 2B manages the data collected from all the name-tag type sensor nodes (NN). More specifically, the sensor network server (SS) stores the data transmitted from the base station (GW) into the database, while transmitting the sensing data based on the requests from the application server (AS) and the client (CL). Further, the sensor network server (SS) receives a control command from the base station (GW), and transmits the result obtained by the control command to the base station (GW).
- the sensor network server (SS) includes a transceiver unit (SSSR), a memory unit (SSME), and a controller (SSCO).
- the sensor network server (SS) should also have a clock when performing the time synchronization management (GWCD).
- the transceiver unit performs data transmission and reception with the base station (GW), the application server (AS), and the client (CL). More specifically, the transceiver unit (SSSR) receives the sensing data transmitted from the base station (GW), and transmits the sensing data to the application server (AS) or the client (CL).
- the memory unit (SSME) includes a hard disc, and a nonvolatile memory device such as a flash memory.
- the memory unit (SSME) stores at least performance database (SSMR), data format information (SSMF), sensing database (SSDB), and terminal management table (SSTT).
- the memory unit (SSME) may also store programs to be executed by a CPU (not shown) of the controller (SSCO). Further, the memory unit (SSME) temporarily stores the updated firmware (GWTF) of the name-tag type sensor node, which was once stored in a terminal firmware registration unit (TFI).
- the performance database is a database for storing evaluations (performances) of the organization and individuals, which are input from the name-tag type sensor nodes (NN) or from the existing data, together with the time data.
- the performance database is the same as the performance database (PDB) of FIG. 1A .
- the performance data is input from a performance input unit (MRPI).
- the data format information includes a data format for communication, a method for separating the sensing data with a tag added in the base station (GW) and for storing in the database, and a method for responding to requests for data.
- the communication controller refers to the data format information (SSMF), typically after data reception and before data transmission, in order to perform data format conversion (SSDF) and data distribution (SSDS).
- the sensing database is a database for storing the sensing data obtained by the name-tag type sensor nodes (NN), information of the name-tag type sensor nodes (NN), information of the base stations (GW) through which the sensing data is transmitted from the name-tag type sensor nodes (NN), and the like. Columns are generated for each of the data elements such as acceleration and temperature to manage the data. It is also possible to generate tables for each of the data elements. In both cases, all the data is managed in association with the terminal information (TRMT), which is the ID of the name-tag type sensor node (NN) that obtained the data, and with the time of acquisition.
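- purely as an illustration of this column-per-element layout keyed by terminal ID and time, a sketch in Python with SQLite (table and column names are invented) might look like this; the read-back query mirrors the selection and time-sorting performed by the data management (SSDA) described below:

    import sqlite3

    con = sqlite3.connect(":memory:")  # stand-in for the sensing database SSDB
    con.execute("""CREATE TABLE sensing (
        terminal_id TEXT NOT NULL,     -- terminal information (TRMT)
        obtained_at TEXT NOT NULL,     -- time the data was obtained
        accel_x REAL, accel_y REAL, accel_z REAL,
        temperature REAL, luminance REAL)""")
    con.execute("INSERT INTO sensing VALUES (?,?,?,?,?,?,?)",
                ("NN-0001", "2008-05-20T09:00:00", 0.1, 0.9, 0.0, 24.5, 300.0))
    rows = con.execute("""SELECT * FROM sensing WHERE terminal_id = ?
                          ORDER BY obtained_at""", ("NN-0001",)).fetchall()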
- the terminal management table (SSTT) is a table containing information about which name-tag type sensor node (NN) is actually under the control of which base station (GW). When another name-tag type sensor node (NN) is added to the base station (GW), the terminal management table (SSTT) is updated.
- the controller includes the central processing unit CPU (not shown) to control the transmission/reception of sensing data, and control the reading/writing of sensing data from/to the database. More specifically, the CPU executes the programs stored in the memory unit (SSME) to perform the processes of communication control (SSCC), terminal management information correction (SSTF), and data management (SSDA).
- the communication control controls the timing of wired or wireless communication with the base station (GW), the application server (AS), and the client (CL). Further, as described above, the communication control (SSCC) converts the format of the data to be transmitted and received, to the data format in the sensor network server (SS), or to the data format specific to each communication target, based on the data format information (SSMF) stored in the memory unit (SSME). Then, the communication control (SSCC) reads the header portion indicating the type of the data, and distributes the data to the corresponding process. More specifically, the received data is transferred to the data management (SSDA), and the command for correcting the terminal management information is transferred to the terminal management information correction (SSTF). The destination of the transmission data is determined to be the base station (GW), the application server (AS), or the client (CL).
- the terminal management information correction receives the command from the base station (GW) to correct the terminal management information, and updates the terminal management table (SSTT).
- the data management (SSDA) manages the correction, acquisition, and addition of the data in the memory unit (SSME). For example, the data management (SSDA) stores each element of the sensing data into the appropriate column of the database based on the tag information. When the sensing data is read from the database, the data management (SSDA) performs processes such as selecting necessary data based on the time information and the terminal information, and sorting the data in order of time.
- the sensor network server receives data through the base station (GW). Then, the data management (SSDA) classifies the received data, and stores it in the performance database (SSMR) and in the sensing database (SSDB). This corresponds to the organizational dynamics data collection (BMB) in FIG. 1A .
- the application server (AS) shown in FIG. 2B analyzes and processes the sensing data.
- An analysis application is activated upon request from the client (CL), or automatically at a specified time.
- the analysis application transmits a request to the sensor network server (SS) and obtains necessary sensing data. Further, the analysis application analyzes the obtained data, and then transmits the analyzed data to the client (CL). It is also possible that the analysis application stores the analyzed data directly to an analysis database.
- the application server includes a transceiver unit (ASSR), a memory unit (ASME), and a controller (ASCO).
- the transceiver unit (ASSR) performs the data transmission and reception with the sensor network server (SS) and the client (CL). More specifically, the transceiver unit (ASSR) receives a command transmitted from the client (CL), and transmits a data acquisition request to the sensor network server (SS). Further, the transceiver unit (ASSR) receives the sensing data from the sensor network server (SS), and transmits analyzed data to the client (CL).
- the memory unit (ASME) includes a hard disc, and an external memory device such as a memory card or an SD card.
- the memory unit (ASME) stores the setting conditions for analysis and the analyzed data. More specifically, the memory unit (ASME) stores a display condition (ASMJ), analysis algorithm (ASMA), analysis parameter (ASMP), terminal information-name (ASMT), analysis database (ASMD), correlation coefficient (ASMS), and combined table (CTB).
- the display condition (ASMJ) temporarily stores conditions for display requested from the client (CL).
- the analysis algorithm stores programs for analysis.
- the appropriate program is selected in response to the request from the client (CL).
- the analysis is performed by the selected program.
- the analysis parameter (ASMP) stores, for example, parameters for feature amount extraction. When the parameters are changed in response to the request from the client (CL), the analysis parameter (ASMP) is rewritten.
- the terminal information-name (ASMT) is a lookup table associating the ID of a terminal with the name, attribute, and other information of the person wearing the terminal.
- the name of the person is added to the terminal ID of the data received from the sensor network server (SS).
- the name of the person is converted to the terminal ID and a data acquisition request is transmitted to the sensor network server (SS), with reference to the terminal information-name (ASMT).
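- a toy illustration of this two-way reference between terminal ID and name (all entries are invented):

    # terminal information-name (ASMT) as a small in-memory table
    ASMT = {"NN-0001": {"name": "Wearer A", "department": "Research"}}

    def name_of(terminal_id):   # ID -> name, used when displaying results
        return ASMT[terminal_id]["name"]

    def terminal_of(name):      # name -> ID, used when requesting data
        return next(tid for tid, rec in ASMT.items() if rec["name"] == name)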
- the analysis database is a database for storing the analyzed data.
- the analyzed data may be temporarily stored before transmission to the client (CL). It is also possible that a large amount of analyzed data is stored so that the analyzed data can be freely obtained in bulk. When the data is transmitted to the client (CL) while being analyzed, there is no need to use the analysis database (ASMD).
- the correlation coefficient (ASMS) stores correlation coefficients determined by the correlation coefficient study (BMD).
- the correlation coefficient (ASMS) is used for the organizational activity analysis (BME).
- the combined table is a table for storing data relating to plural name-tag type sensor nodes aligned by the inter-data alignment (BMC).
- the controller includes a central processing unit CPU (not shown) to control the data transmission/reception and to analyze the sensing data. More specifically, the CPU executes the programs stored in the memory unit (ASME) to perform communication control (ASCC), analysis condition setting (ASIS), data acquisition request (ASDR), inter-data alignment (BMC), correlation coefficient study (BMD), organizational activity analysis (BME), terminal information-user reference (ASDU), and other processes.
- the communication control controls the timing of wired or wireless communication with the sensor network server (SS) and the client (CL). Further, the communication control (ASCC) converts the format of the data, and distributes the data to appropriate destinations according to data types.
- the analysis condition setting (ASIS) receives analysis conditions set by a user (US) through the client (CL), and stores the received analysis conditions into the memory unit (ASME). Further, the analysis condition setting (ASIS) generates a command to request data to the server, and transmits a data acquisition request (ASDR).
- the correlation coefficient study (BMD) is a process corresponding to the correlation coefficient study (BMD) in FIG. 1B .
- the correlation coefficient study (BMD) is performed using the analysis algorithm (ASMA).
- the result is stored in the correlation coefficient (ASMS).
- the organizational activity analysis (BME) is a process corresponding to the organizational activity analysis (BME) in FIG. 1C .
- the organizational activity analysis (BME) is performed by obtaining the stored correlation coefficient (ASMS) and using the analysis algorithm (ASMA).
- the results of the analysis are recorded in the analysis database (ASMD).
- the terminal information-user reference converts the data managed with the terminal information (ID) into the name or other designation of the user of each terminal, based on the terminal information-name (ASMT).
- the terminal information-user reference (ASDU) may also provide additional information such as the title and position of the user.
- the terminal information-user reference (ASDU) may be omitted, if not necessary.
- the client (CL) shown in FIG. 2B interfaces with the user (US) for inputting and outputting data.
- the client (CL) includes an input/output unit (CLIO), a transceiver unit (CLSR), a memory unit (CLME), and a controller (CLCO).
- the input/output unit (CLIO) serves as an interface with the user (US).
- the input/output unit (CLIO) includes a display (CLOD), a keyboard (CLIK), a mouse (CLIM), and the like. It is also possible to connect another input/output device to an external input/output (CLIU) according to the necessity.
- the display is an image display device such as a CRT (Cathode-Ray Tube) or a liquid crystal display.
- the display (CLOD) may include a printer and the like.
- the transceiver unit (CLSR) performs the data transmission and reception with the application server (AS) or the sensor network server (SS). More specifically, the transceiver unit (CLSR) transmits the analysis conditions to the application server (AS) and receives the analysis results.
- the memory unit (CLME) includes a hard disc, and an external memory device such as a memory card or an SD card.
- the memory unit (CLME) stores information necessary for drawing, such as analysis condition (CLMP) and drawing setting information (CLMT).
- the analysis condition (CLMP) stores the conditions set by the user (US), such as the number of members to be analyzed and the selection of analysis method.
- the drawing setting information (CLMT) stores the information about the drawing position, namely, what is plotted in which part of the drawing.
- the memory unit (CLME) may also store programs to be executed by a CPU (not shown) of the controller (CLCO).
- the controller includes the CPU (not shown) to perform communication control, input of the analysis conditions from the user (US), drawing of the analysis results to be presented to the user (US) and the like. More specifically, the CPU executes the programs stored in the memory unit (CLME) to perform communication control (CLCC), analysis condition setting (CLIS), drawing setting (CLTS), and organizational activity display (BMF), or other processes.
- the communication control controls the timing of wired or wireless communication with the application server (AS) or the sensor network server (SS). Further, the communication control (CLCC) converts the format of the data, and distributes the data to appropriate destinations according to data types.
- the analysis condition setting (CLIS) receives analysis conditions specified by the user (US) through the input/output unit (CLIO), and stores in the analysis condition (CLMP) of the memory unit (CLME).
- conditions such as the period of the data used for analysis, the members, the type of analysis, and the parameters for analysis are set.
- the client (CL) requests an analysis by transmitting the settings to the application server (AS), while performing the drawing setting (CLTS).
- the drawing setting (CLTS) determines how to display the analysis results based on the analysis condition (CLMP), as well as the plotting positions.
- the results of this process are stored in the drawing setting information (CLMT) of the memory unit (CLME).
- the organizational activity display (BMF) generates charts by plotting the analysis results obtained from the application server (AS). For example, the organizational activity display (BMF) plots a display like a radar chart, a time-series graph, and an organizational structure display, as shown in the organizational activity display (BMF) of FIG. 1C . At this time, the organizational activity display (BMF) also displays the attribute such as the name of the displayed person, if necessary.
- the generated display result is presented to the user (US) through the output device such as the display (CLOD).
- the user (US) can finely adjust the display position by an operation such as drag and drop.
- FIGS. 3A to 3E are external views showing an example of the configuration of the name-tag type sensor node.
- the name-tag type sensor node has a strap attachment portion NSH to which a neck strap or a clip is attached.
- the name-tag type sensor node is worn on the neck or chest of the user.
- FIG. 3A is a top view
- FIG. 3B is a front view
- FIG. 3C is a bottom view
- FIG. 3D is a back view
- FIG. 3E is a left side view.
- a liquid crystal display device (LCDD) is provided on the front side of the name-tag type sensor node.
- the liquid crystal display device displays the content of display B as a name tag with the department and name of the wearer, which will be described below.
- the liquid crystal display device displays the organizational activity feedback data for the wearer.
- the material of the surface of the name-tag type sensor node is transparent, so that an inserted card CRD can be seen from the outside through the case material.
- the design of the name-tag surface can be changed by replacing the card (CRD) inserted into the name-tag type sensor node.
- the name-tag type sensor node according to the present invention can be worn by a person in the same manner as in common name tags, allowing for obtaining physical quantities by sensors without bringing discomfort to the wearer.
- the LED lamps LED 1 , LED 2 are used for notifying the wearer and the person facing the wearer of the state of the name-tag type sensor node.
- the lights of LED 1 and LED 2 are guided to the front side and the top side, respectively. The lighting state can be seen both from the wearer of the name-tag type sensor node and from the person facing the wearer.
- the name-tag type sensor node includes the speaker SP.
- the speaker SP is used for notifying the wearer and the person facing the wearer of the state of the name-tag type sensor node by buzzer or voice.
- the microphone MIC obtains the speech of the wearer of the name-tag type sensor node as well as the sound around the wearer.
- the luminance sensors LS 1 F, LS 1 B are provided on the front and back of the name-tag type sensor node, respectively. From the luminance values obtained by LS 1 F and LS 1 B, it is detected that the name-tag type sensor node is worn reversed, and the wearer is notified of this.
- buttons BTN 1 , BTN 2 , BTN 3 are provided on the left side of the name-tag type sensor node. These buttons are used for changing the operation mode of wireless communication and switching the liquid crystal display.
- on the bottom side of the name-tag type sensor node, there are provided a power switch PSW, a reset button RBTN, a cradle connector CRDIF, and an external expansion connector EXPT.
- on the front side of the name-tag type sensor node, there are provided plural infrared transceivers TRIR 1 to TRIR 4 .
- the provision of plural infrared transceivers is specific to the name-tag type sensor node.
- the infrared transceiver intermittently transmits the identification number (TRMD) of the own name-tag type sensor node by infrared radiation.
- another function of the infrared transceiver is to receive the identification number transmitted by the name-tag type sensor node worn by the other person. In this way, it is recorded which name-tag type sensor node faces which, and when. Thus, it is possible to detect the state of face-to-face communication between the wearers.
- four infrared transceivers TRIR 1 to TRIR 4 are provided in the upper portion of the name-tag type sensor node.
- FIG. 4A shows the positional relationship between two persons HUM 3 , HUM 4 communicating face-to-face. It rarely happens that two persons are perfectly in front of each other. They often stand diagonally opposite to each other at about shoulder width. At this time, the facing state between the name-tag type sensor nodes may not be detected if the infrared transceivers have sensitivity only in the front of the name tag. It is necessary to have sensitivity at angles of about 30 degrees left and right relative to lines L 4 , L 6 drawn from the surfaces of name-tag type sensor nodes NN 2 , NN 3 worn by HUM 3 , HUM 4 , respectively.
- FIG. 4B shows the positional relationship when the person HUM 1 sitting in a chair and the person HUM 2 standing are communicating with each other. There is a difference in height position of the head between the person sitting in the chair and the person standing, so that the upper body of the person HUM 1 sitting in the chair is slightly tilted upward.
- a line L 3 connecting name-tag type sensor nodes NN 10 and NN 11 worn by HUM 1 and HUM 2 is located below lines L 1 , L 2 drawn from the respective name-tag surfaces. Under this condition, the two name-tag type sensor nodes should have downward sensitivity in order to reliably detect the facing state.
- FIG. 4C shows an example of the placement of the infrared transceivers TRIR 1 to TRIR 4 .
- the infrared transceivers TRIR 1 and TRIR 4 which are provided outside, are placed at an angle of 15 degrees outward in the horizontal direction.
- the infrared transceivers TRIR 2 and TRIR 3 which are provided inside, are placed at an angle of 15 degrees outward in the horizontal direction and at an angle of 30 degrees downward in the vertical direction.
- the infrared transceiver itself has sensitivity at an angle of about ±15 degrees. As a result, this placement realizes a total sensitivity of 45 degrees downward, 15 degrees upward, and ±30 degrees left and right of the name tag. This makes it possible to reliably obtain the state of face-to-face communication between persons. It is needless to say that the number and angle of the infrared transceivers TRIR 1 to TRIR 4 are not limited to the placement in this example.
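- the totals follow from adding each transceiver's mounting angle to its own half-angle of about 15 degrees; a one-line check with the figures of this example:

    HALF_ANGLE = 15                      # per-device sensitivity, in degrees
    left_right = 15 + HALF_ANGLE         # outer pair, mounted 15 deg outward
    downward = 30 + HALF_ANGLE           # inner pair, tilted 30 deg downward
    upward = 0 + HALF_ANGLE              # no upward tilt
    print(left_right, downward, upward)  # 30 45 15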
- the triaxial acceleration sensor (ACC) mounted on the name-tag type sensor node can detect movements of the wearer. At the same time, the triaxial acceleration sensor (ACC) can detect the orientation of the name-tag type sensor node by detecting the acceleration of gravity.
- FIG. 5 shows the axes of the triaxial acceleration detected by the name-tag type sensor node of this example.
- the example defines that the acceleration applied in the horizontal direction of the name-tag type sensor node is X axis, the acceleration applied in the vertical direction is Y axis, and the acceleration applied in the cross direction is Z axis.
- the Y axis is positive in the downward direction of the name-tag type sensor node.
- the force of gravity acts on the bottom side of the name tag (in the Y axis direction), and thus 1 G is detected in the Y axis.
- when the name-tag type sensor node is tilted, the acceleration detected in the Y axis is a value smaller than 1 G. The value is negative when the name-tag type sensor node is completely turned upside down.
- a node up/down detection circuit UDDET monitors whether the static acceleration applied to the Y axis is 1 G or a smaller value, in order to detect whether the name-tag type sensor node is facing the wearer or the other person.
- the name-tag type sensor node changes the content to be displayed on the liquid crystal display device based on a detection result, namely, an up/down detection signal UDDETS in FIG. 2A .
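- a minimal sketch of this decision (the threshold and names are illustrative assumptions, not taken from the patent):

    def is_upside_down(static_accel_y_g, threshold_g=0.0):
        """UDDET-style check: about +1 G on the Y axis means the node hangs
        normally; a negative reading means it is turned toward the wearer."""
        return static_accel_y_g < threshold_g

    # is_upside_down(0.98)  -> False: show the name tag display (DNM)
    # is_upside_down(-0.95) -> True: show the analysis results (ANA)
    mode = "ANA" if is_upside_down(-0.95) else "DNM"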
- the liquid crystal display device LCDD displays the personal data including the name and the department.
- the buttons are assigned in the following way:
- pressing the button 3 extends the wireless transmission interval to suppress the power consumption.
- pressing the button 3 again returns the extended wireless transmission interval to normal mode.
- the liquid crystal display device LCDD displays a name-tag state display screen (Status screen), a display screen of infrared communication history (IrAct screen), an organizational performance input screen (Rating screen), and a name tag setting screen (Option screen).
- the buttons are assigned in the following way:
- Button 3 paging button
- Reference numeral D 101 denotes an example of the display screen when the name-tag type sensor node is not upside down (and is facing the other person). It functions as a name tag with the department and name of the wearer displayed thereon.
- the Status screen is a display screen displaying the operation of the name-tag type sensor node, such as the communication state with the base station and the detected infrared ID.
- the screen is changed to the IrAct screen (D 120 ), Message screen (D 130 ), Rating screen (D 140 ), and Option screen (D 150 ) each time the button 3 is pressed as the paging button.
- the display screen is returned to the Status screen when the button 3 is pressed in the Option screen.
- the IrAct screen (D 120 ) is a screen displaying the number of times the infrared radiation is received from the persons with whom the wearer has communicated face-to-face in the day.
- the infrared reception count reflects the face-to-face communication time: the larger the value, the longer the face-to-face communication.
- Information for three persons is displayed on the screen at a time.
- the screen is scrolled to display the information one by one (D 121 , D 122 ) each time the button 1 is pressed. Further, the screen shifts to a mode of hourly display when the button 2 is pressed in the IrAct screen. In this case, the screen can also be scrolled by the button 1.
- when the button 2 is pressed again in the hourly display (D 125 , D 126 ), the screen returns to the daily display (D 120 , D 121 , D 122 ).
- the Message screen is a screen for transmitting a message to a specific name-tag type sensor node from the application or from the other name-tag type sensor node, and transmitting a response to the message (D 130 ).
- on the Rating screen (D 140 ), the wearer inputs subjective evaluations of the organizational performances at an arbitrary time.
- the performances in terms of health state (Health), mental state (Mental), and motivation to study (Study) are rated in five grades.
- the input ratings are transferred to the application server (AS), and are used for the correlation coefficient study (BMD) of the organizational activity analysis (BME).
- the above described name-tag type sensor node of this example includes a secondary battery, in combination with a cradle as a means of charging the built-in secondary battery.
- an external power supply unit does not necessarily have the configuration of a cradle, as a means of supplying power from the outside to the name-tag type sensor node.
- power may be supplied directly from an AC adaptor.
- FIG. 7 shows an example of the configuration between a cradle CRD and the name-tag type sensor node NN.
- a cradle connection interface CRDIF is provided in the bottom of the name-tag type sensor node NN.
- the cradle connection interface CRDIF is connected to a connection interface CCRDIF on the side of the cradle CRD, and then power is supplied.
- the cradle CRD does not include a battery, so the power must be constantly supplied from the AC adaptor or the like.
- the name-tag type sensor node is used in the office environment. For this reason, the name-tag type sensor node is assumed to be charged at night by attaching to the cradle after office hours. However, some workplaces have a rule that the last person turns off a breaker to shut the power off in the room. In this case, no power is supplied to the cradle at night, so that the name-tag type sensor node is not charged.
- FIG. 7 shows the battery for cradle CRDBATT.
- FIG. 8 shows the connection relationship among the name-tag type sensor node NN, the cradle CRD, and the battery for cradle CRDBATT.
- the name-tag type sensor node NN is charged with power from the outside through an EPOW+ terminal and an EPOW− terminal of the interface CRDIF with the cradle CRD.
- the cradle CRD supplies the power received from ADP+ and ADP− to the EPOW+ and EPOW− terminals of the name-tag type sensor node NN. In this way, the built-in secondary battery of the name-tag type sensor node is charged.
- the battery for cradle CRDBATT is supplied with power from the AC adaptor (ACADP) and the like through AC+ and AC− of an external power supply terminal ACIF. Then, the battery for cradle CRDBATT charges its own built-in secondary battery.
- the cradle CRD is directly supplied with the power through the cradle interface CRDBATTIF. In this way, the power of the built-in secondary battery continues to be supplied to the cradle CRD even after the power supply from the AC adaptor ACADP is shut off. With this configuration, the built-in secondary battery of the name-tag type sensor node is reliably charged, thereby preventing missing data due to battery exhaustion.
- FIG. 9 shows a specific example of the hardware configuration of the name-tag type sensor node NN shown in FIG. 2A .
- the hardware of the name-tag type sensor node NN is roughly divided into a power supply unit NN 1 P and a main body NN 1 M.
- the power supply unit NN 1 P includes a built-in secondary battery BATT, a regulator REG, a power switch PSW, and an external power detection circuit PDET.
- the power supply unit NN 1 P stabilizes the power from the secondary battery BATT by the regulator REG, and supplies the power to the main body NN 1 M through BATPOW+, BATPOW−.
- the power supply unit NN 1 P includes the cradle interface CRDIF.
- the external power detection circuit PDET detects that the power is supplied from the outside, and notifies the main body by the external power detection signal PDETS.
- the power of the name-tag type sensor node can be turned on/off by the power switch PSW. Even when the power is turned off, the secondary battery is charged by attaching the name-tag type sensor node to the cradle, in which case the power is supplied from the cradle.
- the function and configuration of the power supply unit NN 1 P are the same when the power is supplied directly from the AC adaptor or the like to the name-tag type sensor node.
- the PDETS signal is connected to a general purpose IO port PIO of the main body NN 1 M of the name-tag type sensor node. With this configuration, the main body of the name-tag type sensor node can recognize whether the power is supplied from the external power supply unit.
- the name-tag type sensor node is mainly controlled by a microcomputer MCU of the main body NN 1 M.
- the microcomputer MCU is a large scale integrated circuit LSI that integrates various peripheral functions through an internal bus IBUS, in addition to a central processing unit CPU.
- Examples of typical peripheral functions incorporated in the microcomputer are a serial interface, an A/D converter, a memory, a timer, and a general purpose IO port.
- This example shows a microcomputer integrating three-channel serial interfaces (SIO 0 , SIO 1 , SIO 2 ), an A/D converter (ADC), a timer (TIMR), a general purpose IO port (PIO), a random access memory (RAM), and a flash memory (FLSH).
- the name-tag type sensor node NN converts the information obtained from the various sensors into digital values by the A/D converter ADC. Then, the name-tag type sensor node NN stores the digital values into the memory unit STRG together with the face-to-face communication information obtained by the infrared transceivers TRIR 1 to TRIR 4 , while transmitting the data to the base station through a wireless communication circuit RF. Further, the name-tag type sensor node NN analyzes the data obtained from the sensors, and displays the results on the display device LCDD. The display device LCDD is controlled by the general purpose IO port PIO through LCDIF. The name-tag type sensor node NN further includes an expansion port EXPT capable of inputting/outputting analog and digital values for possible future expansion. The expansion port EXPT includes analog input/output terminals EXAD 0 , EXAD 1 , in addition to a signal EXTIO for the general purpose IO port.
- the triaxial acceleration sensor ACC, the microphone MIC for obtaining voice, the temperature sensor THM, and the luminance sensors LS 1 F, LS 1 B are all connected to the A/D converter ADC.
- the A/D converter ADC has six input channels (AD 0 to AD 5 ).
- the channels AD 4 and AD 5 can also be used as D/A converters.
- the A/D converter receives data from various sensors.
- the A/D converter is connected to analog input/output terminals EXTAD 0 , EXTAD 1 of the external expansion port EXPT, and is also connected to a terminal voltage BATDETS of the secondary battery BATT to detect exhaustion of the secondary battery BATT.
- the A/D converter has a limited number of input ports while being connected to a wide variety of sensors, and the number of ports is insufficient for the number of sensors. Thus, one port is used in a time-sharing manner to allow A/D conversion of the desired sensor information.
- for the acceleration data, the amount of change is significant and the frequency of acquisition is high, so the three axes are independently assigned to AD 0 to AD 2 .
- the voice input from the microphone MIC is amplified by an input amplifier IAMP, and passes through a low pass filter LPF to cut frequency components exceeding the Nyquist frequency. Then, data SNDD is obtained as the raw voice, as well as energy AVGD integrated by the integrating circuit AVG.
- an analog switch SEL 1 selects between the data SNDD and the energy AVGD, and inputs the selected signal to the channel AD 3 .
- An analog switch SEL 2 selects among the external input signal EXTAD 0 from the external expansion port EXPT, the terminal voltage BATDETS of the secondary battery BATT, and the voice output to the speaker. Then, the analog switch SEL 2 connects the selected signal to the channel AD 4 .
- the voice output to the speaker is amplified by an output amplifier OAMP to drive the speaker SP.
- An analog switch SEL 3 selects among data THMD obtained by the temperature sensor THM, data LS 1 FD, LS 1 BD obtained by the luminance sensors LS 1 F, LS 1 B, and the external input signal EXTAD 1 from the external expansion port EXPT. Then, the analog switch SEL 3 inputs the selected signal to the channel AD 5 .
- the analog switches SEL 1 , SEL 2 , SEL 3 provided in the A/D converter are controlled by an ADSEL signal output from the general purpose IO port PIO.
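- a sketch of this time-shared sampling, with hypothetical ADC and IO-port helpers standing in for the real registers:

    def sample(adc, io_port, channel, adsel=None):
        """Route the wanted signal via ADSEL, then read the shared channel."""
        if adsel is not None:
            io_port.write("ADSEL", adsel)  # drives SEL 1 / SEL 2 / SEL 3
        return adc.read(channel)

    # AD 0 to AD 2 are dedicated to the acceleration axes; AD 3 to AD 5 are
    # shared, e.g. sample(adc, io_port, 5, adsel=0) might pick the temperature
    # THMD and sample(adc, io_port, 5, adsel=1) a luminance value, depending
    # on how the select lines are wired (an assumption of this sketch).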
- the wireless communication circuit RF communicates with the microcomputer MCU through RFIF which is a serial communication. Because the amount of communication data is large and the usage frequency is high, a serial interface channel 0 (SIO 0 ) is exclusively assigned to the wireless communication circuit RF. Further, the infrared transceivers should be kept ready to perform a waiting operation, in order to receive the ID from the other name-tag sensor node and obtain face-to-face communication information. Thus, the infrared transceivers are connected to a serial port channel 1 (SIO 1 ).
- the transmission circuits of the four infrared transceivers TRIR 1 to TRIR 4 are driven by a channel 1 serial transmission signal SIO 1 TxD which is common to all the transmission circuits. With respect to the reception, the receivers of the four infrared transceivers TRIR 1 to TRIR 4 are ORed (IROR 1 ), and connected to a channel 1 serial reception signal SIO 1 RxD.
- a signal STRGIF is for the memory unit STRG.
- a signal RTCIF is for a real time clock RTC to obtain absolute time.
- a communication means EXTSIO communicates with the cradle. These signals also use a serial communication interface, but the usage frequency is limited.
- thus, a serial port channel 2 (SIO 2 ) is used in a time-sharing manner. At this time, a selector SSEL 2 switches the signals by a signal SIO 2 SEL output from the general purpose IO port PIO.
- the operation timing of the CPU of the name-tag type sensor node is determined by the following factors: the time information from the real time clock RTC; when the voice obtained by the microphone MIC exceeds certain energy; and when input from the buttons (BTN 1 , BTN 2 , BTN 3 ) is received. These factors can generate interrupt signals RTCINT, SNDINT, BTNINT to the CPU, respectively.
- a comparator CMP 1 detects that energy AVGD of the voice exceeds a predetermined value, and generates the interrupt signal SNDINT to the CPU.
- the button inputs from BTN 1 , BTN 2 , BTN 3 can be obtained by the general purpose IO port (PIO) through BTN 1 IF, BTN 2 IF, BTN 3 IF, respectively. Further, an OR circuit (OR 2 ) detects the input change and generates the button interrupt signal BTNINT.
- the CPU can be reset through a reset interface RSTS.
- the name-tag type sensor node cradle of this example is roughly divided into a power supply unit CRD 1 P and a main body CRD 1 M.
- the power supply unit CRD 1 P includes a charging circuit CHG for charging the built-in secondary battery of the name-tag type sensor node through the cradle interface CCRDIF from an external power supply, and a regulator CREG for stabilizing the power for the operation of the cradle itself.
- the cradle is supplied with power through terminals ADP+, ADP− of a cradle battery interface CBATIF, by means of an external power supply such as the AC adaptor, or the battery for cradle described below.
- the power supplied through ADP+, ADP− is stabilized by the regulator CREG, and is supplied to the cradle main body CRD 1 M through CPOW+, CPOW−.
- the power is used by the charging circuit CHG for charging the secondary battery of the name-tag type sensor node through EPOW+, EPOW− of the interface CCRDIF between the cradle and the name-tag type sensor node.
- the main body CRD 1 M of the name-tag type sensor node cradle has a sensor node circuitry including: a wireless communication circuit CRF for performing wireless communication; a microcomputer CMCU for controlling the wireless communication circuit CRF; infrared transceivers CTRIR 1 and CTRIR 2 ; a real time clock CRTC; and a memory unit CSTRG.
- the wireless communication circuit CRF, the microcomputer CMCU for controlling the wireless communication circuit CRF, and the infrared transceivers CTRIR 1 and CTRIR 2 correspond to the wireless communication circuit RF, the microcomputer MCU for controlling the wireless communication circuit RF, and the infrared transceivers TRIR 1 to TRIR 4 in the name-tag type sensor node, respectively.
- in general, a cradle serves to supply power to the device attached thereto, and has only a power supply unit.
- the sensor node circuitry incorporated in the main body CRD 1 M is specific to this example.
- the cradle is mainly controlled by the microcomputer CMCU of the main body.
- the microcomputer CMCU is an LSI that integrates various peripheral functions through an internal bus CIBUS, in addition to a central processing unit CCPU.
- the microcomputer integrates three-channel serial interfaces (CSIO 0 , CSIO 1 , CSIO 2 ), a timer (CTIMR), a general purpose IO port (CPIO), a random access memory (CRAM), and a flash memory (CFLSH).
- the wireless communication circuit CRF communicates with the microcomputer CMCU through CRFIF which is a serial communication. Because the amount of communication data is large and the usage frequency is high, a serial interface channel 0 (CSIO 0 ) is exclusively assigned to the wireless communication circuit CRF. Further, the infrared transceivers CTRIR 1 and CTRIR 2 should be kept ready to perform a waiting operation, in order to receive the ID from the other name-tag sensor node and obtain face-to-face communication information. Thus, the infrared transceivers are connected to a serial port channel 1 (CSIO 1 ).
- transmission circuits of the two infrared transceivers CTRIR 1 and CTRIR 2 are driven by a channel 1 serial transmission signal CSIO 1 TxD which is common to all the transmission circuits.
- the receivers of the two infrared transceivers CTRIR 1 and CTRIR 2 are ORed (CIROR), and connected to a channel 1 serial reception signal CSIO 1 RxD.
- a signal CSTRGIF is for the memory unit CSTRG.
- a signal CRTCIF is for the real time clock CRTC to obtain absolute time.
- a communication means CEXTSIO communicates with the name-tag type sensor node. These signals also use a serial communication interface, but the usage frequency is limited. Thus, a serial port channel 2 (CSIO 2 ) is used in a time-sharing manner. At this time, a selector CSSEL 2 switches the signals by a signal CSIO 2 SEL output from the general purpose IO port CPIO.
- the cradle of this example has a function of detecting that the cradle is facing the name-tag type sensor node, and wirelessly transmitting the information.
- the cradle is assumed to be placed on a desk. When a person wearing the name-tag type sensor node sits in front of the desk, an infrared communication is performed between the name-tag type sensor node and the cradle. In this way, information is recorded about who is sitting at that place and when.
- the hardware of the battery for cradle includes a secondary battery BATTS 2 , and a circuit BCHG for charging the secondary battery BATTS 2 .
- the cradle is supplied with power through BBATIF having two terminals ADP+, ADP−.
- the secondary battery BATTS 2 includes batteries connected in a two-parallel, two-series array. Each battery is the same as the secondary battery incorporated in the name-tag type sensor node. In principle, the secondary battery incorporated in the name-tag type sensor node could be charged with the power of one battery. However, charging a secondary battery generally requires a higher voltage. For this reason, two batteries are connected in series to gain voltage, and two such strings are connected in parallel to store a sufficient amount of power.
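- with hypothetical single-cell figures (not stated in the patent), the two-series/two-parallel arithmetic works out as:

    CELL_VOLTAGE_V = 3.7        # assumed rating of one cell
    CELL_CAPACITY_MAH = 1000    # assumed capacity of one cell
    pack_voltage = 2 * CELL_VOLTAGE_V          # series pair -> 7.4 V to charge
    pack_capacity_mah = 2 * CELL_CAPACITY_MAH  # parallel pair -> 2000 mAh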
- the name-tag type sensor node obtains physical quantities necessary to calculate organizational activities.
- the name-tag type sensor node displays the output on the name tag while transmitting to the base station.
- the name-tag type sensor node is used by a person wearing it, and preferably small and lightweight. Thus, it is necessary to have a small battery for operating the name-tag type sensor node.
- the acquisition of the physical quantities from the sensors and the transmission operation of the sensor data from the sensors are intermittently performed.
- FIGS. 12A to 12F show an example of the timing of obtaining sensor data as physical quantity data from the sensors, the timing of transmitting the sensor data, and the timing of writing the sensor data into the memory unit.
- FIG. 12A shows the timing of obtaining voice from the microphone by the A/D converter.
- FIG. 12B shows the timing of obtaining acceleration by the A/D converter at a constant interval of TSN 2 .
- FIG. 12C shows the timing of obtaining temperature and luminance by the A/D converter.
- the acceleration is obtained at an interval of time TSN 2 .
- the temperature and luminance are obtained at an interval of time TSN 3 .
- voice produces the largest amount of data per unit of time, followed by acceleration, temperature, and luminance.
- the timing of sampling of the physical quantities is arbitrary depending on the sensor type, and the magnitude of the timing interval is not limited.
- FIG. 12D shows the timing of wirelessly transmitting the packet.
- a data set SENSD 1 , which contains 4 data pieces of voice, 2 data pieces of acceleration, and 1 data piece each of temperature and luminance, is wirelessly transmitted as a packet TRD 1 .
- data sets SENSD 2 , SENSD 3 are wirelessly transmitted as packets TRD 2 , TRD 3 , respectively.
- the wireless transmission interval is not necessarily constant.
- FIG. 12D shows an example of the timing of wireless transmission at a constant interval of time TTR 1 .
- FIG. 12E shows the timing of storing the physical quantity data obtained by the sensors into the memory unit.
- the data sets SENSD 2 , SENSD 3 are stored in the memory unit as data CMBD 2 and CMBD 3 , respectively.
- the frequency and interval of the writing timing to the memory unit are not limited in the present invention.
- FIG. 12F shows the state of the external power detection signal PDETS for detecting the connection of the name-tag type sensor node to the external power supply.
- when PDETS indicates a high level, the name-tag type sensor node is attached, for example, to the cradle and is supplied with power from the outside to charge the secondary battery.
- in general, wireless communication is not always completed normally. The same applies to the communication between the name-tag type sensor node and the base station: the communication may fail, for example, when there is no base station near the name-tag type sensor node.
- suppose the packets TRD 2 , TRD 4 , TRD 5 are not normally transmitted. Even if the transfer to the base station fails, the data is not lost, because the data CMBD 2 , CMBD 4 , CMBD 5 corresponding to TRD 2 , TRD 4 , TRD 5 obtained from the sensors is still stored in the memory unit. The data is retransmitted to the base station after communication is recovered, so that, finally, no collected data is missing. The retransmission of the data stored in the memory unit to prevent missing data is called bulk transmission.
- the bulk transmission is performed when the name-tag type sensor node is connected to the external power supply. This sequence is specific to this example.
- FIG. 12F shows that the PDETS signal is changed to a high level at a timing T 1 and the external power supply is connected to the name-tag type sensor node.
- the name-tag type sensor node starts the bulk transmission upon detection of the PDETS signal.
- a packet TRD 2 R generated from the data CMBD 2 stored in the memory unit is transferred after transfer of the packet TRD 6 .
- the packet TRD 2 R corresponds to the packet TRD 2 that has failed to be transferred.
- packets TRD 4 R and TRD 5 R, as retransmission data of the packets TRD 4 and TRD 5 , are transferred after transfer of the packets TRD 7 , TRD 8 , respectively.
- the transfer interval TTR 1 is changed to TTR 2 which is shorter than TTR 1 . This is done, as described above in FIG. 2A , by switching the time bases TB 1 , TB 2 by TMGSEL in the transfer timing controller TRTMG, and by switching the data to be communicated by appropriately controlling the communication data selector TRDSEL.
- the data is continuously obtained from the sensors after the external power supply is connected.
- alternatively, the data acquisition from the sensors may be interrupted during bulk transmission. Also in this case, the bulk transmission can be performed effectively by reducing the transfer interval.
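- putting the above together, a sketch of the bulk transmission loop follows; the record layout and radio interface are assumptions, while the shortened interval TTR 2 and the retransmission idea come from the text:

    import time

    def bulk_transmit(storage, radio, ttr2_s):
        """Resend every stored record whose earlier transfer failed, at the
        shortened interval TTR2, while external power (PDETS high) lasts."""
        for record in storage:              # records mirror CMBD 1, CMBD 2, ...
            if record["sent"]:
                continue
            if radio.send(record["data"]):  # retransmission, e.g. TRD 2 R
                record["sent"] = True
            time.sleep(ttr2_s)              # TTR 2 is shorter than TTR 1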
- the operation software of the name-tag type sensor node is called firmware.
- the firmware is stored in the flash memory FLSH incorporated in the microcomputer MCU shown in FIG. 9 .
- the flash memory is rewritten by dedicated hardware provided by the manufacturer of the microcomputer.
- such hardware does not support simultaneous rewriting of plural flash memories.
- the name-tag type sensor node is worn by an individual to recognize the activities of the organization.
- the number of name-tag type sensor nodes is equal to the number of wearers. It would take a lot of time, and is unrealistic, to collect all the name-tag type sensor nodes for each firmware update and to rewrite the flash memories one by one.
- the name-tag type sensor node NN has a function of wirelessly transferring the firmware to be updated and updating such firmware. This sequence will be described with reference to the drawings.
- the name-tag type sensor node firmware to be updated is registered in the memory unit SSME of the sensor network server SS through the terminal firmware registration unit TFI.
- the registration means TFI is not particularly limited; for example, the firmware may be transferred by FTP through a network NW.
- FIGS. 13A to 13D show the flow of data and the timing. It is shown that the data is obtained by the sensors, transferred from the name-tag type sensor node to the base station, and transferred to the sensor network server SS.
- FIG. 14 is a process flowchart of the name-tag type sensor node.
- the name-tag type sensor node searches the base station and performs connection operation called associate (P 117 ). Then, the name-tag type sensor node performs time adjustment process to synchronize the sensing time with the other name-tag type sensor node (P 118 ). Next, as described above, the name-tag type sensor node obtains sensor information in an intermittent manner (P 102 ), transmits the sensor data (P 103 ), and stores the sensor data into the memory unit (P 104 ). The process is performed for each interval of TTR 3 (P 101 ), unless the name-tag type sensor node is attached to the cradle (P 105 ).
- FIGS. 13A to 13D show an example of the flow of sensor data SND 10 , SND 11 , SND 12 , SND 13 , SND 14 , and SND 15 that is transmitted from the sensor node of FIG. 13C to the base station of FIG. 13B at an interval of TTR 3 .
- the base station normally receives the sensor data, and transfers the sensor data as SND 21 , SND 22 , SND 24 , and SND 25 to the sensor network server (SS) of FIG. 13A .
- sensor data SND 10 , SND 13 , SND 14 indicated by the dotted lines is not normally transferred to the base station due to disturbance or other problems, and thus not transferred to the sensor network server. This is repeated during the intermittent sensing operation (TP).
- FIG. 13D shows an example of the case in which the name-tag type sensor node is connected to the external power supply by attaching the name-tag type sensor node to the cradle, or other means.
- the name-tag type sensor node moves to the bulk transmission operation (TC).
- the name-tag type sensor node reads, from the sensor data stored in the memory unit, the sensor data that was not normally transmitted (P 107 ), and transmits it to the base station (P 108 ). This is repeated at an interval of TTR 4 (P 106 ) until all the data is transmitted (P 109 ), as in the sketch below.
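- a corresponding sketch of the bulk transmission loop, using the same assumed memory and radio interfaces as above; the interval value is illustrative.

```python
import time

def bulk_transmit(memory, radio, TTR4=0.5):
    """Sketch of P106..P109: flush unsent sensor data at the short interval TTR4."""
    for record in memory.unsent_records():  # P107: read data the base station never received
        radio.send(record)                  # P108: retransmit to the base station
        time.sleep(TTR4)                    # P106: TTR4 < TTR3 is safe on external power
    # P109: the loop ends once every stored record has been transmitted
```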
- TTR 4 < TTR 3 , which is a feature of the present invention. Because the name-tag type sensor node is attached to the cradle, the power supply is stable and the radio wave environment is good. Under these conditions, it is possible to transfer a large amount of data, represented by the bulk transmission data, efficiently and without exhausting the battery.
- the name-tag type sensor node moves to an operation of transferring update firmware (TF).
- the name-tag type sensor node first inquires of the sensor network server, through the base station, whether update firmware is registered in the server (P 110 ). More specifically, the name-tag type sensor node transmits a query packet (FDRQ 1 ) to the base station. Then, the base station further transmits a query packet (FDRQ 2 ) to the sensor network server, as in the sketch below.
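- a sketch of this two-hop query; the object interfaces and packet fields are hypothetical, while the packet names FDRQ 1 and FDRQ 2 and the step number come from the text.

```python
def query_update_firmware(node, base_station, server):
    """Sketch of P110: the node asks via the base station whether firmware is registered."""
    fdrq1 = {"type": "FW_QUERY", "node_id": node.node_id}  # FDRQ1: node -> base station
    fdrq2 = dict(fdrq1, relay_id=base_station.station_id)  # FDRQ2: base station -> server
    return server.has_registered_firmware(fdrq2)           # True if SSTF is held in SSME
```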
- until the name-tag type sensor node is removed from the cradle, the operation is arbitrary: the normal intermittent sensing operation may be performed, or the operation may be stopped.
- the flow of FIG. 14 shows an example of stopping the operation (P 116 ).
- the update firmware is transferred to the name-tag type sensor node from the sensor network server through the base station (P 113 ).
- a packet FD 2 is transferred from the sensor network server to the base station, and the packet FD 2 is then transferred from the base station to the name-tag type sensor node. This is repeated at an interval of TTR 5 (P 112 ) until all the update firmware is transferred (P 114 ).
- TTR 5 < TTR 3 , which is a feature of the present invention. Because the name-tag type sensor node is attached to the cradle, the power supply is stable and the radio wave environment is good. Under these conditions, it is possible to transfer the data for which reliability is desired, represented by the update firmware, efficiently and without exhausting the battery.
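- a sketch of this transfer loop follows; the chunking, method names, and interval value are assumptions, while the step numbers and the packet label FD 2 come from the text.

```python
import time

def transfer_firmware(server, base_station, node, TTR5=0.2):
    """Sketch of P112..P114: relay firmware packets down the chain at interval TTR5."""
    for chunk in server.firmware_chunks():  # the registered image SSTF, split into packets
        fd2 = base_station.relay(chunk)     # server -> base station -> node (packet FD2)
        node.store_chunk(fd2)               # collected for the later FLSH rewrite
        time.sleep(TTR5)                    # P112: TTR5 < TTR3, power-safe on the cradle
    node.apply_firmware()                   # P114: all packets received; update can proceed
```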
- FIGS. 15A to 15K show the relationship among the sensor information acquisition timing, the wireless transmission/reception timing, and the infrared transmission/reception timing, when a name-tag type sensor node 1 and a name-tag type sensor node 2 face each other.
- FIG. 15A shows the timing of obtaining acceleration and voice.
- FIG. 15B shows the timing of wirelessly transmitting/receiving the obtained sensor information.
- FIG. 15C shows the timing that an infrared transceiver 1 of the name-tag type sensor node 1 transmits its own ID.
- FIGS. 15D, 15E, and 15F show the timing of transmissions from the infrared transceivers 2 , 3 , and 4 of the name-tag type sensor node 1 .
- the receiver waits for at least one transmission timing interval. More specifically, as shown in FIG. 15G , the receiver of the infrared transceiver 1 of the name-tag type sensor node 2 performs a waiting operation (IRRT 1 ) during a period in which the infrared transceivers 1 to 4 of the name-tag type sensor node 1 each transmit at least once, namely during IRTT 1 , IRTT 2 , IRTT 3 , and IRTT 4 . The same applies to the infrared transceivers 2 , 3 , and 4 of the name-tag type sensor node 2 . As shown in FIGS. 15H, 15I, and 15J, the receivers can perform the waiting operation in a time-sharing manner (IRRT 2 , IRRT 3 , IRRT 4 ).
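- the window arithmetic can be sketched as follows; the function and its return format are illustrative, not from the patent.

```python
def ir_receive_windows(irtt, n=4):
    """Sketch of IRRT1..IRRT4: each receiver waits over a span covering one full
    transmit cycle IRTT1..IRTT4 of the facing node, staggered in time-sharing."""
    span = n * irtt  # long enough for every facing transceiver to transmit at least once
    # Receiver k opens its window one transmit slot after receiver k-1 (time-sharing).
    return [(k * irtt, k * irtt + span) for k in range(n)]

print(ir_receive_windows(irtt=0.1))  # four staggered (start, end) waiting windows
```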
- the waiting state of the infrared receiver increases the power consumption.
- the intermittent operation is performed to reduce the power consumption of the infrared receiver to one fourth or less.
- since the name-tag type sensor nodes are synchronized with each other, there is no need to constantly perform the infrared transmission and reception.
- the infrared transmission and reception are performed at an interval IRNIT 1 .
- the MPU of each of the name-tag type sensor nodes 1 and 2 is switched from a normal operation mode (MPUMD 1 ) to a low power consumption state (MPUMD 2 ) in order to reduce the power consumption.
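- a duty-cycle sketch of this intermittent operation, with assumed ir and mpu interfaces; only the interval name IRNIT 1 and the mode names MPUMD 1 / MPUMD 2 come from the text.

```python
import time

def intermittent_ir_cycle(ir, mpu, active_window, irnit1):
    """Sketch of the IRNIT1 cycle: a short IR burst, then MPUMD2 until the next one.
    Assumes irnit1 > active_window."""
    while True:
        mpu.set_mode("MPUMD1")                   # normal operation mode for the burst
        ir.transmit_and_receive(active_window)   # synchronized IR send/receive window
        mpu.set_mode("MPUMD2")                   # low power consumption state
        time.sleep(irnit1 - active_window)       # idle for the rest of the interval
```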
- the cradle is placed on the desk and the name-tag type sensor node is attached to the cradle after office hours.
- the power supply is stable, thereby ensuring stable communication.
- when the name-tag type sensor node is attached to the cradle, a large amount of data, such as the bulk transmission data, is transferred at an increased communication frequency.
- the data for which reliability is desired, such as the rewriting data of the firmware of the name-tag type sensor node, is also transferred at this time. This allows the data transfer to be performed without exhausting the battery of the name-tag type sensor node and without unnecessarily compressing the communication bandwidth.
Landscapes
- Engineering & Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Arrangements For Transmission Of Measured Signals (AREA)
- Selective Calling Equipment (AREA)
- Mobile Radio Communication Systems (AREA)
- Optical Communication System (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
Abstract
Description
where,
[Formula 2]
p1 = a1X1 + a2X2 + ... + amXm   (2)
[Formula 3]
p1 = a1x1 + a2x2 + ... + amxm   (3)
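Numerically, formulas (2) and (3) are the same weighted sum, applied to the variables X in (2) and to the observed values x in (3). A small sketch with made-up coefficients a and values x:

```python
a = [0.4, 0.3, 0.3]  # hypothetical coefficients a1..am
x = [1.2, 0.8, 2.0]  # hypothetical observed values x1..xm
p1 = sum(ai * xi for ai, xi in zip(a, x))  # p1 = a1*x1 + a2*x2 + ... + am*xm
print(p1)  # 1.32
```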
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/370,788 US20120139750A1 (en) | 2007-05-30 | 2012-02-10 | Sensor node |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2007-143642 | 2007-05-30 | ||
JP2007143642A JP5010985B2 (en) | 2007-05-30 | 2007-05-30 | Sensor node |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/370,788 Continuation US20120139750A1 (en) | 2007-05-30 | 2012-02-10 | Sensor node |
Publications (2)
Publication Number | Publication Date |
---|---|
US20080297373A1 US20080297373A1 (en) | 2008-12-04 |
US8138945B2 true US8138945B2 (en) | 2012-03-20 |
Family
ID=40087538
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/153,689 Expired - Fee Related US8138945B2 (en) | 2007-05-30 | 2008-05-22 | Sensor node |
US13/370,788 Abandoned US20120139750A1 (en) | 2007-05-30 | 2012-02-10 | Sensor node |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/370,788 Abandoned US20120139750A1 (en) | 2007-05-30 | 2012-02-10 | Sensor node |
Country Status (2)
Country | Link |
---|---|
US (2) | US8138945B2 (en) |
JP (1) | JP5010985B2 (en) |
Families Citing this family (34)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2009145187A1 (en) * | 2008-05-26 | 2009-12-03 | 株式会社日立製作所 | Human behavior analysis system |
US8665784B2 (en) * | 2008-09-29 | 2014-03-04 | Stmicroelectronics, Inc. | Web based smart sensor network tracking and monitoring system |
JP2010211360A (en) * | 2009-03-09 | 2010-09-24 | Hitachi Ltd | Electronic apparatus and system using the same |
WO2011055628A1 (en) * | 2009-11-04 | 2011-05-12 | 株式会社日立製作所 | Organization behavior analyzer and organization behavior analysis system |
JP5568767B2 (en) * | 2010-08-06 | 2014-08-13 | 株式会社日立製作所 | Infrared transmission / reception system and infrared transmission / reception method |
JP5537396B2 (en) * | 2010-12-01 | 2014-07-02 | 株式会社日立製作所 | Sensing device and system |
EP2661054B1 (en) | 2010-12-27 | 2020-08-26 | FINEWELL Co., Ltd. | Transmitter/receiver unit |
JP5783352B2 (en) | 2011-02-25 | 2015-09-24 | 株式会社ファインウェル | Conversation system, conversation system ring, mobile phone ring, ring-type mobile phone, and voice listening method |
JP5698614B2 (en) * | 2011-06-22 | 2015-04-08 | インターナショナル・ビジネス・マシーンズ・コーポレーションInternational Business Machines Corporation | Context information processing system and method |
JP5723014B2 (en) | 2011-09-06 | 2015-05-27 | 株式会社日立製作所 | Sensor terminal |
KR101863831B1 (en) | 2012-01-20 | 2018-06-01 | 로무 가부시키가이샤 | Portable telephone having cartilage conduction section |
JP6043511B2 (en) * | 2012-06-08 | 2016-12-14 | 株式会社日立製作所 | Sensor terminal |
KR20180061399A (en) | 2012-06-29 | 2018-06-07 | 로무 가부시키가이샤 | Stereo earphone |
US9893551B2 (en) | 2012-12-26 | 2018-02-13 | Elwha Llc | Ad-hoc wireless sensor package |
US10491050B2 (en) | 2012-12-26 | 2019-11-26 | Elwha Llc | Ad hoc wireless sensor package |
US9900668B2 (en) * | 2012-12-26 | 2018-02-20 | Elwha Llc | Ad-hoc wireless sensor package |
US10826335B2 (en) | 2012-12-26 | 2020-11-03 | Elwha Llc | Ad-hoc wireless sensor package |
US10230267B2 (en) | 2012-12-26 | 2019-03-12 | Elwha Llc | Ad-hoc wireless sensor package |
JP6275449B2 (en) * | 2013-10-31 | 2018-02-07 | 株式会社ファインウェル | Transmitter / receiver, name tag, and contactless IC card |
KR101877652B1 (en) | 2013-08-23 | 2018-07-12 | 로무 가부시키가이샤 | Portable telephone |
EP3062491B1 (en) | 2013-10-24 | 2019-02-20 | FINEWELL Co., Ltd. | Bracelet-type transmission/reception device and bracelet-type notification device |
JP6271321B2 (en) * | 2014-03-28 | 2018-01-31 | セコム株式会社 | Qualification display device and program |
US9454558B2 (en) | 2014-04-23 | 2016-09-27 | International Business Machines Corporation | Managing an index of a table of a database |
US9483515B2 (en) | 2014-04-23 | 2016-11-01 | International Business Machines Corporation | Managing a table of a database |
US9420536B2 (en) * | 2014-06-10 | 2016-08-16 | Qualcomm Incorporated | Controlling power consumption in peer-to-peer communications |
JP6551919B2 (en) | 2014-08-20 | 2019-07-31 | 株式会社ファインウェル | Watch system, watch detection device and watch notification device |
CN107113481B (en) | 2014-12-18 | 2019-06-28 | 株式会社精好 | Connecting device and electromagnetic type vibration unit are conducted using the cartilage of electromagnetic type vibration unit |
US10967521B2 (en) | 2015-07-15 | 2021-04-06 | Finewell Co., Ltd. | Robot and robot system |
JP6551929B2 (en) | 2015-09-16 | 2019-07-31 | 株式会社ファインウェル | Watch with earpiece function |
KR102108668B1 (en) | 2016-01-19 | 2020-05-07 | 파인웰 씨오., 엘티디 | Pen-type handset |
JP6336689B1 (en) * | 2018-01-30 | 2018-06-06 | 株式会社ウフル | IoT device management system |
JP2020053948A (en) | 2018-09-28 | 2020-04-02 | 株式会社ファインウェル | Hearing device |
CN109558006B (en) * | 2018-11-23 | 2022-11-18 | 武汉灏存科技有限公司 | Wireless distributed limb motion capture device |
JP2020195122A (en) * | 2019-05-30 | 2020-12-03 | 京セラ株式会社 | Communication apparatus and control method for the same |
Family Cites Families (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5233643A (en) * | 1991-10-08 | 1993-08-03 | Network Access Corporation | Method and system network for providing an area with limited bandwidth bi-direction burst personnel telecommunications |
JP2908334B2 (en) * | 1996-08-12 | 1999-06-21 | 静岡日本電気株式会社 | Individually selected paging receiver |
JPH11298412A (en) * | 1998-04-07 | 1999-10-29 | Minolta Co Ltd | Optical transmission system of device |
US6208247B1 (en) * | 1998-08-18 | 2001-03-27 | Rockwell Science Center, Llc | Wireless integrated sensor network using multiple relayed communications |
JP4116212B2 (en) * | 1999-12-28 | 2008-07-09 | 株式会社東芝 | COMMUNICATION DEVICE AND ITS CONTROL METHOD |
JP2003178348A (en) * | 2001-12-07 | 2003-06-27 | Birukon Kk | Paper sheet discriminating counter and discriminating counting method |
JP4469136B2 (en) * | 2003-04-03 | 2010-05-26 | パナソニック株式会社 | Communication terminal device and wireless communication method |
DE10340346A1 (en) * | 2003-08-29 | 2005-04-28 | Hella Kgaa Hueck & Co | Sensor device, in particular for motor vehicles |
US7362210B2 (en) * | 2003-09-05 | 2008-04-22 | Honeywell International Inc. | System and method for gate access control |
JP4257231B2 (en) * | 2004-02-27 | 2009-04-22 | Necインフロンティア株式会社 | Charger for terminal |
JP2005286502A (en) * | 2004-03-29 | 2005-10-13 | Casio Comput Co Ltd | Portable information equipment, program rewriting method, and program |
US20050265731A1 (en) * | 2004-05-28 | 2005-12-01 | Samsung Electronics Co.; Ltd | Wireless terminal for carrying out visible light short-range communication using camera device |
JP2006013811A (en) * | 2004-06-24 | 2006-01-12 | Matsushita Electric Ind Co Ltd | Information terminal |
JP4460528B2 (en) * | 2004-12-14 | 2010-05-12 | 本田技研工業株式会社 | IDENTIFICATION OBJECT IDENTIFICATION DEVICE AND ROBOT HAVING THE SAME |
JP2006197177A (en) * | 2005-01-13 | 2006-07-27 | Matsushita Electric Ind Co Ltd | Communication collision preventing method for radio node |
JP2006318188A (en) * | 2005-05-12 | 2006-11-24 | Medical Electronic Science Inst Co Ltd | Mountain climber support system |
JP2007025120A (en) * | 2005-07-14 | 2007-02-01 | Hitachi Ltd | Radio terminal device, character data display method and program |
JP2008287690A (en) * | 2007-04-20 | 2008-11-27 | Hitachi Ltd | Group visualization system and sensor-network system |
- 2007-05-30: JP JP2007143642A patent/JP5010985B2/en not_active Expired - Fee Related
- 2008-05-22: US US12/153,689 patent/US8138945B2/en not_active Expired - Fee Related
- 2012-02-10: US US13/370,788 patent/US20120139750A1/en not_active Abandoned
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004178045A (en) | 2002-11-25 | 2004-06-24 | Omron Corp | Alarm system, power supply device and its method, and program |
JP2004235965A (en) | 2003-01-30 | 2004-08-19 | Fuji Photo Film Co Ltd | Recording system for image data at vehicle driving |
US20050235091A1 (en) * | 2004-04-20 | 2005-10-20 | Caph Chen | USB hub with built-in storage device |
US20060229520A1 (en) | 2005-04-08 | 2006-10-12 | Shunzo Yamashita | Controller for sensor node, measurement method for biometric information and its software |
JP2006312010A (en) | 2005-04-08 | 2006-11-16 | Hitachi Ltd | Controller of sensor node, and measuring method for biometric information and program |
Non-Patent Citations (4)
Title |
---|
Daniel Olguin Olguin, et al., "Wearable Communicator Badge: Designing a New Platform for Revealing Organizational Dynamics," IEEE 10th International Symposium on Wearable Computing (Doctoral Colloquium Proceedings), Montreaux, Switzerland (Oct. 2006) (3 pages). |
Joan Morris DiMicco, et al., "Using Visualizations to Review a Group's Interaction Dynamics," Conference on Human Factors in Computing Systems (CHI), (Apr. 2006); (6 pages). |
Mathew Laibowitz, et al., "A Sensor Network for Social Dynamics", 5th International Conference on Information Processing in Sensor Networks (IPSN), (Apr. 2006); pp. 483-491. |
Peter A. Gloor, et al., "Studying Microscopic Peer-to-Peer Communication Patterns," Americas Conference on Information Systems (AMCIS), (Aug. 2007) (12 pages). |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090156195A1 (en) * | 2007-12-18 | 2009-06-18 | Humblet Pierre A | Obtaining time information in a cellular network |
US20090154447A1 (en) * | 2007-12-18 | 2009-06-18 | Humblet Pierre A | Absolute time recovery |
US8379625B2 (en) * | 2007-12-18 | 2013-02-19 | Airvana Llc | Obtaining time information in a cellular network |
US8520659B2 (en) | 2007-12-18 | 2013-08-27 | Airvana Llc | Absolute time recovery |
US20130103643A1 (en) * | 2010-06-18 | 2013-04-25 | Mitsubishi Electric Corporation | Data processing apparatus, data processing method, and program |
US9146927B2 (en) * | 2010-06-18 | 2015-09-29 | Mitsubishi Electric Corporation | Data processing apparatus, data processing method, and program |
US10049336B2 (en) | 2013-02-14 | 2018-08-14 | Sociometric Solutions, Inc. | Social sensing and behavioral analysis system |
US20190166413A1 (en) * | 2017-11-30 | 2019-05-30 | Cniguard Ltd | Monitor for Underground Infrastructure |
Also Published As
Publication number | Publication date |
---|---|
JP5010985B2 (en) | 2012-08-29 |
JP2008301071A (en) | 2008-12-11 |
US20120139750A1 (en) | 2012-06-07 |
US20080297373A1 (en) | 2008-12-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8138945B2 (en) | Sensor node | |
JP5372588B2 (en) | Organization evaluation apparatus and organization evaluation system | |
US20220000405A1 (en) | System That Measures Different States of a Subject | |
US9111242B2 (en) | Event data processing apparatus | |
JP4977534B2 (en) | Sensor network system and sensor node | |
JP5672934B2 (en) | Sensing data display device and display system | |
JP5202204B2 (en) | Data management system | |
JP2009211574A (en) | Server and sensor network system for measuring quality of activity | |
JP2017208005A (en) | Sensor data analysis system and sensor data analysis method | |
JP5503719B2 (en) | Performance analysis system | |
US20200005211A1 (en) | Information processing system | |
CN109747568B (en) | Method and device for controlling internet of things equipment linked with vehicle | |
JP2010166211A (en) | Data management system and method | |
US9496954B2 (en) | Sensor terminal | |
CN116127082A (en) | Data acquisition method, system and related device | |
JP5506593B2 (en) | Sensor data collection system | |
US20120191413A1 (en) | Sensor information analysis system and analysis server | |
JP2010217939A (en) | Knowledge creation behavior analysis system and processor | |
JP6594512B2 (en) | Psychological state measurement system | |
JP7088397B1 (en) | Data collection system, data collection device, data acquisition device and data collection method | |
JP5568767B2 (en) | Infrared transmission / reception system and infrared transmission / reception method | |
JP2024121850A (en) | Monitoring system, monitoring method and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment
Owner name: HITACHI, LTD., JAPAN
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HAYAKAWA, MIKI;MORIWAKI, NORIHIKO;YANO, KAZUO;AND OTHERS;REEL/FRAME:021047/0485;SIGNING DATES FROM 20080407 TO 20080420
Owner name: HITACHI, LTD., JAPAN
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HAYAKAWA, MIKI;MORIWAKI, NORIHIKO;YANO, KAZUO;AND OTHERS;SIGNING DATES FROM 20080407 TO 20080420;REEL/FRAME:021047/0485
FEPP | Fee payment procedure
Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
Free format text: PAYER NUMBER DE-ASSIGNED (ORIGINAL EVENT CODE: RMPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
ZAAA | Notice of allowance and fees due
Free format text: ORIGINAL CODE: NOA
ZAAB | Notice of allowance mailed
Free format text: ORIGINAL CODE: MN/=.
STCF | Information on status: patent grant
Free format text: PATENTED CASE
FPAY | Fee payment
Year of fee payment: 4
MAFP | Maintenance fee payment
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
Year of fee payment: 8
FEPP | Fee payment procedure
Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
LAPS | Lapse for failure to pay maintenance fees
Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
STCH | Information on status: patent discontinuation
Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362
FP | Lapsed due to failure to pay maintenance fee
Effective date: 20240320