barracuda » Sat Jul 20, 2013 10:53 pm wrote:For instance, they became intrigued by "tool-like objects oriented at 30 degrees," including spatulas and needle-nose pliers.
Holy crap, they're looking at porn.
Microsoft Kinect used to live-translate Chinese sign language into text
Researchers from the Chinese Academy of Sciences (CAS) have used a Microsoft Kinect to live-translate Chinese sign language into text.
The work, a collaboration between the CAS Institute of Computing Technology and Microsoft Research Asia, could be vital to helping deaf and non-deaf people communicate with each other.
justdrew » 10 Jul 2013 00:20 wrote:http://en.wikipedia.org/wiki/Schema_theory
http://en.wikipedia.org/wiki/Social_cognition
Note that in the first video I posted, the fellow states that they have voice recognition (speech to text) working at nearly 100% accuracy. (And that was seven years ago.)
If you have a DB of phone calls and know the participants, you can develop a training file for each person's voice. So you'd be able to have the software convert the calls to text with high reliability, and even recognize voices that are otherwise unidentified (say, someone using a payphone).
So what?
Well, one thing this would let you do is develop a map of each individual's cognitive schema and their particular associational map.
Which would enable you to (quite possibly enable your software to) write a flexible script that could be used to prime, actively exploit the mere-exposure effect, evoke desired schema associations, and generally lead the conversant wherever you want them to go... with some degree of likelihood. A salesman's dream come true.
But more than salesmen would find it handy.
Very potent personalized push-polling, for instance.
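A minimal sketch of what building such an associational map might look like, assuming you already have speaker-labelled transcripts from the speech-to-text step; the co-occurrence counting, names, and toy data here are my own illustration, not anything from an actual system:

```python
from collections import defaultdict
from itertools import combinations

def build_association_map(transcripts):
    """Build a per-speaker word co-occurrence map from call transcripts.

    transcripts: iterable of (speaker_id, utterance_text) pairs.
    Returns {speaker_id: {(word_a, word_b): count}} -- a crude proxy
    for which concepts a given person tends to link together.
    """
    maps = defaultdict(lambda: defaultdict(int))
    for speaker, text in transcripts:
        words = sorted(set(text.lower().split()))
        for a, b in combinations(words, 2):
            maps[speaker][(a, b)] += 1
    return maps

def strongest_associations(assoc_map, speaker, top_n=5):
    """Return the word pairs this speaker links together most often."""
    pairs = assoc_map.get(speaker, {})
    return sorted(pairs.items(), key=lambda kv: kv[1], reverse=True)[:top_n]

# Toy usage: two "calls" from the same hypothetical speaker.
calls = [
    ("joe_blow", "the game was great and the pizza was great"),
    ("joe_blow", "pizza after the game is the best part"),
]
assoc = build_association_map(calls)
print(strongest_associations(assoc, "joe_blow"))
```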
Certainly the existence of the mere-exposure effect dictates that the content of TV and movies MUST be regulated.
How much of that regulation is Controlled/Dictated vs arising naturally from "market forces" is an open question. Certainly the Market Forces are also regulated and this is likely the primary point for control, you know, Nielsen ratings and all that crap.
Another wonderful thing to look for is if/when the thinking-machines (not "free" thinking; for now these are just software doing 'cognitive computations'), with access to "the personal data system" (that network of loci that store info about, or submitted by, individuals), become increasingly able to PROVIDE DIRECT STIMULUS to individuals.
This will certainly initially take the form of individually customized "coupons" sent to people. Does Joe Blow need a little reward? His phone beeps and he's got a coupon for a free {Favorite Food Item}. In time the available array of stimuli that can be applied will expand and differentiate. So it'll be possible to do positive conditioning as well as the current negative (punishment for law breaking). Though the negative conditioning will no doubt expand as well. If you're speeding, expect your bank account to automatically be dinged. If intoxication is detected, expect the car to pull over and park itself, locking you in while police are en route.
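Purely to illustrate the shape of the thing, a toy version of the stimulus dispatcher this would imply; the event names, rules, and amounts are all invented for the example:

```python
from dataclasses import dataclass

@dataclass
class Profile:
    name: str
    favorite_food: str
    account_balance: float

def dispatch_stimulus(profile, event):
    """Map one observed event to a (channel, action) pair.

    Hypothetical conditioning rules only -- nothing here reflects a real system.
    """
    if event == "needs_reward":
        # Positive conditioning: push a personalized coupon.
        return ("phone", f"Coupon: free {profile.favorite_food}")
    if event == "speeding_detected":
        # Negative conditioning: automatic fine.
        profile.account_balance -= 50.0
        return ("bank", f"Fine applied, balance now {profile.account_balance:.2f}")
    if event == "intoxication_detected":
        return ("car", "Pull over, park, lock doors, notify police")
    return ("none", "no action")

joe = Profile(name="Joe Blow", favorite_food="burrito", account_balance=200.0)
for ev in ("needs_reward", "speeding_detected"):
    print(dispatch_stimulus(joe, ev))
```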
I predict the Next-Facebook/Next-Google will be a company that is able to effectively push such positive stimuli to consumer attention.
In fact I'd be a damn fool not to start it up right now and get bought out ASAP. I may indeed be a damn fool, though. Who wants to go at it with me? With luck we could fuck it up in such a way that it's delayed for a generation.
Any bogus Patent Experts in the house?
http://en.wikipedia.org/wiki/Felicific_calculus
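For reference, Bentham's felicific calculus amounts to scoring a pleasure or pain along seven dimensions (intensity, duration, certainty, propinquity, fecundity, purity, extent). A back-of-the-envelope version; the equal weighting and the sample numbers are my own placeholders:

```python
def felicific_score(intensity, duration, certainty, propinquity,
                    fecundity, purity, extent=1):
    """Toy felicific calculus: sum the six per-person dimensions and
    scale by extent (how many people are affected). Bentham gives no
    canonical weights, so everything here is weighted equally."""
    return (intensity + duration + certainty + propinquity
            + fecundity + purity) * extent

# Compare two hypothetical "stimuli" on an arbitrary -10..10 scale.
coupon = felicific_score(intensity=4, duration=2, certainty=9,
                         propinquity=9, fecundity=3, purity=7)
fine = felicific_score(intensity=-6, duration=3, certainty=9,
                       propinquity=9, fecundity=-2, purity=-5)
print(coupon, fine)  # 34 8
```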
and another thing...
Given access to "the personal data system" (internet posts, phone conversations, text messages, etc.), a significant amount of automated psychological analysis and categorization could be done, based on individuals' responses to characters and events in various fictional universes, particularly the ones with an almost "one of each type" cast of characters (Harry Potter, Game of Thrones, reality-TV programs, etc.), using concepts from projective psychotherapy.
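A rough sketch of how the automated categorization part could work, assuming someone has hand-built a table mapping fictional characters to archetypes; the table, word lists, and scoring below are invented for illustration:

```python
import re
from collections import Counter

# Hypothetical archetype table: character name -> archetype label.
ARCHETYPES = {
    "hermione": "rule-follower",
    "snape": "ambivalent authority",
    "tyrion": "outsider strategist",
    "joffrey": "petty tyrant",
}

POSITIVE = {"love", "like", "admire", "best", "great"}
NEGATIVE = {"hate", "loathe", "worst", "awful", "annoying"}

def archetype_affinities(posts):
    """Score how warmly one individual talks about each archetype.

    posts: list of free-text strings written by that person.
    Returns a Counter mapping archetype -> net sentiment count.
    """
    scores = Counter()
    for post in posts:
        words = re.findall(r"[a-z']+", post.lower())
        for character, archetype in ARCHETYPES.items():
            if character in words:
                scores[archetype] += sum(w in POSITIVE for w in words)
                scores[archetype] -= sum(w in NEGATIVE for w in words)
    return scores

print(archetype_affinities([
    "I love Tyrion, easily the best character",
    "Joffrey is the worst, I hate every scene he's in",
]))
```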
"The Machines of Loving Grace will know you better than you know yourself"
Japanese roboticist Dr. Hiroshi Ishiguro is building androids to understand humans. One is an android version of a middle-aged family man: himself.
Photo gallery of his androids at: http://www.geminoid.jp/en/robots.html.
The robot, like the original, has a thin frame, a large head, furrowed brows, and piercing eyes that, as one observer put it, seem on the verge of emitting laser beams. The android is fixed in a sitting posture, so it can’t walk out of the lab and go fetch groceries. But it does a fine job of what it’s intended to do: mimic a person.
Ishiguro controls this robot remotely, through his computer, using a microphone to capture his voice and a camera to track his face and head movements. When Ishiguro speaks, the android reproduces his intonations; when Ishiguro tilts his head, the android follows suit. The mechanical Ishiguro also blinks, twitches, and appears to be breathing, although some human behaviors are deliberately suppressed. In particular, when Ishiguro lights up a cigarette, the android abstains.
These robots have been covered many times by major media such as the Discovery Channel, NHK, and the BBC. Ishiguro has received the Best Humanoid Award four times at RoboCup, and in 2007 the Synectics Survey of Contemporary Genius selected him as one of the top 100 geniuses alive in the world today.
The idea of connecting a person’s brain so intimately with a remotely controlled body seems straight out of science fiction. In The Matrix, humans control virtual selves. In Avatar, the controlled bodies are alien-human hybrids. In the recent Bruce Willis movie Surrogates, people control robot proxies sent into the world in their places. Attentive viewers will notice that Ishiguro and the Geminoid have cameo roles, appearing in a TV news report on the rapid progress of "robotic surrogacy."
Ishiguro’s surrogate doesn’t have sensing and actuation capabilities as sophisticated as those in the movie. But even this relatively simple android is giving Ishiguro great insight into how our brains work when we come face to face with a machine that looks like a person. He’s also investigating, with assistance from cognitive scientists, how the operator’s brain behaves. Teleoperating the android can be so immersive that strange things happen. Simply touching the android is enough to trigger a physical sensation in him, Ishiguro says, almost as though he were inhabiting the robot’s body.
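For anyone curious, a minimal sketch of the kind of teleoperation loop described above: track the operator's head, map the angles into the android's joint range, send the commands. The tracking stub, joint names, and ranges are placeholders of mine, not details of the actual Geminoid system:

```python
import random
import time

def track_operator_head():
    """Placeholder for a real face/head tracker: returns (yaw, pitch) in degrees."""
    return (random.uniform(-30, 30), random.uniform(-15, 15))

def to_servo(angle_deg, lo=-45.0, hi=45.0):
    """Clamp a tracked angle into the android's (assumed) joint range and
    normalize it to a 0..1 servo command."""
    clamped = max(lo, min(hi, angle_deg))
    return (clamped - lo) / (hi - lo)

def control_step(send):
    """One tick of the loop: sample the operator, command the android."""
    yaw, pitch = track_operator_head()
    send({"neck_yaw": to_servo(yaw), "neck_pitch": to_servo(pitch)})

# Toy run: print the commands instead of sending them to hardware.
for _ in range(3):
    control_step(print)
    time.sleep(0.05)
```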
Join us at the GF2045 International Congress to meet Dr. Ishiguro, see his famous geminoid, and learn more about new and amazing technologies in life extension, robotics, prosthetics and brain function from the world's leading scientists.
Elfoid is a cellphone-type teleoperated android that follows the concept of Telenoid. Its minimal human design and soft, pleasant-to-the-touch exterior are implemented at cellular-phone size. Thanks to its cellular-phone capability, anyone can easily talk with a person in a remote place while feeling as if they were facing each other.
Hugvie is a "human presence" transfer medium that enables users to strongly feel the presence of remote partners while interacting with them. Through research and development of other robots such as "Telenoid R1" (press release August 2010) and "Elfoid P1" (press release March 2011), we have found that hugging and holding these robots during an interaction is an effective way to strongly feel the existence of a partner. "Hugvie" is an epoch-making communication medium that can strongly transfer the presence of an interaction partner despite its simple shape.
Google to buy Nest Labs for $3.2bn
BBC News 13 January 2014
<snip> It produces a thermostat capable of learning user behaviour and working out whether a building is occupied or not, using temperature, humidity, activity and light sensors.
<snip> Google's purchase of Nest Labs follows its acquisition of military robot-maker Boston Dynamics last month and of human-gesture recognition start-up Flutter in October.
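To make the occupancy-detection bit concrete, here is a toy guess built from the same four sensor channels the article lists; the thresholds and voting rule are invented placeholders, since Nest's actual learning algorithm isn't public:

```python
def probably_occupied(temperature_c, humidity_pct, motion_events, light_lux):
    """Naive occupancy guess from the sensor channels the article mentions.

    Real systems learn these thresholds per household; the numbers here
    are arbitrary placeholders.
    """
    signals = [
        motion_events > 0,          # any recent activity
        light_lux > 50,             # lights on
        humidity_pct > 60,          # e.g. someone showering or cooking
        21 <= temperature_c <= 25,  # comfort-range temperature being held
    ]
    return sum(signals) >= 2

print(probably_occupied(temperature_c=22, humidity_pct=40,
                        motion_events=3, light_lux=120))  # True
print(probably_occupied(temperature_c=16, humidity_pct=35,
                        motion_events=0, light_lux=0))    # False
```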
Google’s Schaft Robot Dominates Pentagon Contest
Washington Wire Dec 23, 2013
<snip> Google Inc.'s newly acquired Japanese start-up is poised to secure more Pentagon funding to develop a creation capable of venturing into dangerous disaster zones to help humans. Yeeeaahh...riiiiight.
http://blogs.wsj.com/washwire/2013/12/2 ... n-contest/
Google just bought a high-tech face recognition unit called PittPatt
http://www.fastcompany.com/1768963/how- ... ebs-future
Google Acquires Seven Robot Companies, Wants Big Role in Robotics
IEEE spectrum 4 Dec 2013
http://spectrum.ieee.org/automaton/robo ... -companies