A University of Washington team has developed an artificial intelligence system that lets a headphone wearer look at a person speaking for three to five seconds and then hear only that enrolled speaker's voice in real time, even as the listener moves around a noisy space and no longer faces the speaker.

Pictured is a prototype of the headphone system: binaural microphones attached to off-the-shelf noise-canceling headphones. Credit: Kiyomi Taguchi/University of Washington

Noise-canceling headphones have gotten very good at creating an auditory blank slate. But allowing certain sounds …