Big data research proves useful simply by making it possible to examine and analyze far more than is humanly possible.
For example, it would take a person an entire lifetime to view a week’s worth of uploaded YouTube videos. Assuming the optimistic rate of two pictures a second, it would take a person four years to view every single photo uploaded to Facebook on a given day.
Yet these videos and pictures can be valuable tools in helping to identify people. More specifically, they can feed what researchers Karl Ricanek Jr. of UNC Wilmington and Chris Boehnen of Oak Ridge National Laboratory call “facial analytics.” Facial analytics rely upon soft biometrics, or metrics that indicate a person’s age, gender, and other personal attributes based on a simple photograph without revealing his or her identity.
“A facial analysis system,” explains the September report penned by Ricanek and Boehnen, “explicitly divorces the recognition component from attribute generation: it doesn’t attempt to identify individuals or confirm their identity but instead generates descriptive metadata about them based on their face.”
The idea is that, possibly a la Minority Report, companies could determine certain characteristics such as age, gender, and even personality, and market to that demographic. Today, however, the technology is being used for a greater cause: helping police departments retrieve illicit images on a suspect’s computer.
Artemis, a facial analytics system developed in a partnership among Oak Ridge National Laboratory, UNC Wilmington, and the Knoxville, Tennessee Police Department, goes through each video and image file on a given computer and analyzes the content for possible illegality using a number of tools, including hashing and facial analytics.
The report notes: “On average, it takes a forensic examiner two to five days to conduct an exhaustive examination of a standard home computer for child pornography and generate a report on its contents. Artemis performs a ‘triage’ scan in minutes that outputs a forensic report, tagging possibly illegal videos or images on the suspect computer or memory device.”
Artemis utilizes image hashes in picking out specific images. According to the report, “Hashing functions map large data such as an image file to a fixed-length value or code that can be used for comparisons.” In this way, if an investigator knows what hash codes to look for, he or she can find the incriminating images relatively easily.
However, hashes cannot always be trusted, as computer-literate people can easily change an image’s hash code by altering the photo in minor ways, such as a red-eye adjustment. This is where soft biometrics, in particular age determination, come in.
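A minimal sketch of hash-based matching illustrates both points, using Python’s standard hashlib. (The specific hash functions and matching workflow used by Artemis are not detailed in the report; SHA-256 here is an assumption for illustration.)

```python
import hashlib

def file_hash(data: bytes) -> str:
    """Map arbitrary-size data, such as an image file, to a fixed-length code."""
    return hashlib.sha256(data).hexdigest()

# Known illicit images are represented by their hashes in a reference set.
known_hashes = {file_hash(b"original image bytes")}

# An exact copy of the file produces the same code and is flagged...
assert file_hash(b"original image bytes") in known_hashes

# ...but even a tiny edit (say, a red-eye adjustment) changes every byte of
# the hash, so naive hash matching misses the altered copy entirely.
assert file_hash(b"original image bytes + red-eye fix") not in known_hashes
```

This is why the system cannot rely on hashing alone: the comparison is exact-match, not perceptual.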
Artemis partly relies on the ability to discern a person’s age, a topic that is thoroughly discussed in a 2011 UNCW-published paper by Ricanek and Benjamin Barbour.
Much like a human’s, an automated system’s ability to judge a person’s age depends on the number of faces it has seen and on knowing the ages of those faces. Unlike humans, however, computers are not subject to race and age biases. Further, if a human were asked, “How do you know she’s that young?” he or she would likely respond with an invented answer, the gist of which boils down to “I don’t know.” The database, on the other hand, can pick out the specific facial features that give away certain age hints.
For example, Ricanek oversaw experiments at UNCW which built a machine learning system on top of the FG-NET Aging database.
The results have been promising enough to incorporate into Artemis. According to the 2011 study, “the performance of these systems has improved dramatically during the past decade: for queries evaluated on the FG-NET database, for example, the mean absolute error between true age and estimated age has tumbled from nearly 10 years to under 4 years.”
These tests were run by giving the automated age-detection system a “training database” of 1,600 photos. The ages of the first few hundred are provided before the system relies on machine learning to determine the rest. When the system is then let loose upon unknown images, it can judge people 21–69 years of age within 5.8 years on average, all people within 4.37 years on average, and, most importantly for Artemis, those under the age of 20 within 1.8 years.
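The train-then-generalize procedure described above can be sketched with a toy regression on synthetic data. The features, learning algorithm, and data below are stand-ins, not the published FG-NET system: a single noisy “facial feature” correlated with age substitutes for real face measurements, and a linear fit substitutes for the actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a labeled database of 1,600 photos: one synthetic
# "facial feature" per photo that noisily tracks the true age.
true_ages = rng.uniform(0, 69, size=1600)
feature = true_ages + rng.normal(0, 3, size=1600)

# Fit a simple linear model on the labeled portion...
train_f, train_a = feature[:1200], true_ages[:1200]
slope, intercept = np.polyfit(train_f, train_a, 1)

# ...then estimate ages for the held-out "unknown" faces.
test_f, test_a = feature[1200:], true_ages[1200:]
estimated = slope * test_f + intercept

# Mean absolute error between true and estimated age, in years --
# the same metric the FG-NET benchmarks report.
mae = np.abs(estimated - test_a).mean()
```

With noise of a few years in the feature, the mean absolute error on the held-out set lands in the low single digits, the same ballpark the paper reports for modern systems.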
While not as important to Artemis’s goals, automated gender detection has also come a long way. In a study conducted at UNC Wilmington, the system went up against 278 humans in determining the gender of eight small children, who are difficult to judge when contextual clues are misleading or outright missing. Most human subjects (91 percent) could identify only half, no better than random guessing, while the system correctly identified 67.5 percent of the images. When expanded to include all ages, the system working on the FG-NET database correctly identified 78.1 percent of those 10 and younger, 92.1 percent of those 19–55, and 88.9 percent of those 56 and older.
According to the 2012 report, Artemis is freely available to law enforcement agencies and is currently in use by over 100 such departments. However, today’s facial analytics are not just for law enforcement. India is incorporating facial recognition technology in its ambitious national identification survey. Among other things, facial analytics will help the Indian government ensure it does not double-count any of its billion-plus citizens.
Of course, when the issue of facial recognition is raised, some security and privacy concerns have to be addressed. For example, Google was planning on implementing technology from partner Pittsburgh Pattern Recognition in a significant fashion but ended up limiting it to personal uses such as photo-unlocking one’s smartphone.
The idea is that these technologies estimate soft biometric details such as age and gender without delving too far into a person’s identity. Hypothetically, companies could use the information gleaned from self-taken pictures to send very specific advertisements to people.
With that said, Facebook and Google+ have implemented technologies which pick out individuals to tag in photos. While that specific implementation is mostly harmless, it is not difficult to see how these technologies could devolve into unwanted surveillance. Minority Report was a sociologically unpleasant movie not only because of its dystopic judicial system but also because of the invasive nature of identity detection and the extensive methods required to escape it.