
Your Smartphone Knows You Better Than You Know Yourself

Friday, January 4, 2013 14:02
 
Mobile devices gather piles of data that you may actually find helpful.

Inside Science Minds (ISM) presents an ongoing series of guest columns and personal perspectives by scientists, engineers, mathematicians, and others in the science community, showcasing some of the most interesting ideas in science today.

 
Credit: Gary M. Weiss

(ISM) — Did you ever wonder what your smartphone knows about you? Or how it learns about you? Wouldn’t it be great if it could tell you things that you don’t even recognize about how you walk, talk and act?

Smartphones are already capable of doing this, and many researchers are dedicated to finding ways to gather and interpret the most useful information. Modern smartphones are packed with many powerful sensors that enable the phone to collect data about you. Although that may alarm anyone who is concerned about privacy, the sensors also present an opportunity to help smartphone users in previously impossible ways. When I realized how much these sensors could tell about a person, I established the Wireless Sensor Data Mining (WISDM) Lab at Fordham University in the Bronx, N.Y. The goal of this lab is to apply modern machine learning and data mining methods in order to “mine” knowledge about smartphone users from their sensor data.

Smartphones contain more sensors than most people would ever imagine. Android phones and iPhones include an audio sensor (microphone), image sensor (camera), touch sensor (screen), acceleration sensor (tri-axial accelerometer), light sensor, proximity sensor, and several sensors (including the Global Positioning System) for establishing location.

Early on we decided to focus our efforts on the tri-axial accelerometer, since we felt that it is one of the most informative — and underutilized — sensors. This sensor measures the phone’s acceleration along all three spatial axes; because gravity registers as a constant acceleration, the readings also reveal the phone’s orientation. This enables the phone to adjust the screen display in response to changes in phone orientation, while also supporting advanced motion-based game play.
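
To make this concrete, here is a minimal sketch, in Python, of what raw tri-axial readings look like and how the static gravity component reveals tilt. The sample values are invented, and the pitch/roll formulas follow one common axis convention; real devices differ.

```python
import math

# Hypothetical tri-axial accelerometer samples (m/s^2). At rest the
# vector sums to roughly 9.8 m/s^2 of gravity, and how that gravity
# splits across the three axes reveals the phone's tilt.
samples = [
    (0.2, 9.7, 1.1),   # phone roughly upright, as in a pocket
    (0.1, 0.3, 9.8),   # phone lying flat on a table
]

for ax, ay, az in samples:
    magnitude = math.sqrt(ax**2 + ay**2 + az**2)
    # Pitch and roll estimated from the static gravity component --
    # the same information a phone uses to rotate its screen.
    pitch = math.degrees(math.atan2(ax, math.sqrt(ay**2 + az**2)))
    roll = math.degrees(math.atan2(ay, az))
    print(f"|a|={magnitude:.1f} m/s^2  pitch={pitch:.0f} deg  roll={roll:.0f} deg")
```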

Our first goal was to use the accelerometer to perform activity recognition — to identify the physical activity, such as walking, that a smartphone user is performing. We figured that this ability could then be used as the basis for many health and fitness applications, and could also be used to make the smartphone more context-sensitive, so that its behavior would take into account what the user is doing. The phone could then, for example, automatically send phone calls to voice mail if the user was jogging.

We used existing classification algorithms to map accelerometer data to activities such as walking. These algorithms, or methods, learn from labeled examples. When given data about U.S. football players and non-football players, such an algorithm might learn that football players tend to weigh over 200 lbs. In our case we provide the algorithm with acceleration data that is labeled with the associated activity, and from this data the algorithm automatically generates rules for identifying the activities. Since these rules can be implemented in software, the activity recognition process can be automated.
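
As an illustration of this kind of learning, the toy sketch below trains a decision tree on made-up labeled feature vectors and applies the learned rules to a new example. The feature values are invented, and the library choice (scikit-learn) is ours for illustration, not necessarily what the lab uses.

```python
from sklearn.tree import DecisionTreeClassifier, export_text

# Toy training examples: each row is (mean |a|, std dev of |a|) over a
# window of accelerometer readings, labeled with the activity performed.
X = [
    (9.8, 0.1), (9.8, 0.2),    # standing: near-constant gravity
    (10.2, 1.5), (10.1, 1.7),  # walking: moderate variation
    (11.0, 4.0), (11.3, 4.5),  # jogging: large variation
]
y = ["standing", "standing", "walking", "walking", "jogging", "jogging"]

clf = DecisionTreeClassifier().fit(X, y)

# The learned rules can be inspected, and -- implemented in software --
# applied automatically to new, unlabeled windows.
print(export_text(clf, feature_names=["mean_accel", "std_accel"]))
print(clf.predict([(10.0, 1.6)]))  # -> ['walking']
```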

The activities that our system can recognize include walking, jogging, climbing stairs, sitting, standing, and lying down. We collect a small amount of labeled “training” data from a panel of volunteers for each of these activities, with the expectation that the model we generate will be applicable to other users. The only assumptions we make are that the user’s phone is running our app in the background and that the phone is in their pocket.
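
One common way to turn a raw accelerometer stream into labeled examples like these is to slice it into fixed-length windows and summarize each window with simple statistics. The sketch below assumes a 20 Hz sampling rate and 10-second windows; those numbers, and the particular features, are illustrative assumptions.

```python
import statistics

def window_features(stream, window=200):
    """Summarize a raw (x, y, z) stream as one feature vector per window.

    At an assumed 20 Hz sampling rate, 200 readings span 10 seconds;
    real systems tune both the rate and the window length.
    """
    features = []
    for start in range(0, len(stream) - window + 1, window):
        chunk = stream[start:start + window]
        vec = []
        for axis in range(3):  # x, y, z
            values = [sample[axis] for sample in chunk]
            vec += [statistics.mean(values), statistics.stdev(values)]
        features.append(vec)
    return features
```

Each feature vector is then paired with the activity the volunteer reported performing during that window, yielding the labeled training data.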

Initially, we could identify the six activities listed above with about 75 percent accuracy. These results are adequate for obtaining a general picture of how much time a person spends on each activity daily, but are far from ideal. However, if we can obtain even a very small amount of data that a user actively labels as being connected with a particular activity, we can then build a personal model for that user, with accuracy in the 98-99 percent range. This shows that people move differently and that these differences are important when identifying activities.

We call our system Actitracker. If you download our Android app, you can review reports of your activities via a web-based user interface and see how active or — perhaps more to the point — how inactive you are. We suspect that these reports may serve as a wake-up call to some, and we hope they will lead to positive changes in behavior. Such a tool could also be used by a parent to monitor the activities of a child, and thus could even help combat conditions such as childhood obesity.
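
The daily report itself amounts to aggregating the per-window predictions. A sketch, assuming hypothetical predictions and the 10-second windows from above:

```python
from collections import Counter

# Hypothetical per-window activity predictions for one day,
# one label per 10-second window.
predictions = ["sitting"] * 2400 + ["walking"] * 360 + ["jogging"] * 90

WINDOW_SECONDS = 10
for activity, count in Counter(predictions).most_common():
    print(f"{activity:>8}: {count * WINDOW_SECONDS / 60:6.1f} min")
```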

We are also studying what other things we can learn about a user from their accelerometer data. Currently, using this data we can predict a user’s gender with 71 percent accuracy, and can distinguish between “tall” and “short” people and “heavy” and “light” people, each with about 80 percent accuracy.

We have also established that one’s gait, as measured by a smartphone accelerometer, is distinctive enough to be used for identification purposes. From a pool of several hundred smartphone users, we can identify any individual with 100 percent accuracy if we have a previous data sample. Soon, we may be able to use accelerometer data to help diagnose gait problems. This application is important since gait problems are often indicators of other health problems. All of these applications are based on the same underlying methods of classification as our activity recognition work.
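
Identification reuses the same classification machinery, with the user’s identity as the class label instead of the activity. A sketch with invented gait “signatures” and a nearest-neighbor classifier chosen for illustration:

```python
from sklearn.neighbors import KNeighborsClassifier

# Invented gait "signatures": feature vectors summarizing how each
# person walks, two sample windows per person.
X = [
    (10.1, 1.4, 0.9), (10.2, 1.5, 0.8),  # alice
    (10.6, 2.1, 1.3), (10.5, 2.0, 1.4),  # bob
]
y = ["alice", "alice", "bob", "bob"]

clf = KNeighborsClassifier(n_neighbors=1).fit(X, y)
print(clf.predict([(10.4, 1.9, 1.3)]))  # -> ['bob']
```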

This category of applications is part of a growing trend towards mobile health. As new sensors become available and as existing sensors are improved, even more powerful smartphone-based health applications should appear. For example, other researchers are boosting the magnification of smartphone cameras so that they can analyze blood and skin samples. Researchers at MIT’s Mobile Experience Lab are even developing a sensor that attaches to clothing, which will allow smartphones to track their users’ exposure to ultraviolet radiation and the potential for sunburn.

Smartphone sensor technology, especially when combined with data mining, offers tremendous opportunities for new and innovative applications. We are committed to exploring these applications and expect that there will be a flood of new sensor-based apps over the next decade. While many of these apps may just be curiosities, we suspect that some will “stick” and provide tangible benefits to individuals and society.


Contacts and sources:
By Gary M. Weiss, Inside Science Minds Guest Columnist
Inside Science Minds
