Speaking of visual display systems and sci-fi …

Alex: the problem I see with lasers (I haven’t read your entire exchange) is that if you move your eyes, the projected image will not move with your line of sight. Unless they’ve got crazy stuff that will compensate for that. Then there’s the issue of the focal length of the eye changing as you look at different things. This assumes that the laser is doing an overlay, and not meant to block out everything completely. And if that were the case, you’d have to worry about your peripheral vision.
Alex: I’ll go with the goggle optics until I can get wet-wired

Skennedy: Yeah, I’m of that mind myself. Though I don’t know if wet-ware will ever actually take off. I think we’ll see transmission to the optical nerve or similar before that, perhaps, via a ‘ballcap’.

Alex: transmission to a receiver on the optical nerve would be just fine by me. But I’ll wait until v2.3 before I commit
Alex: or, wire it in to your brain so you see a “window” in your mind with the info in it

Skennedy: I just think copper (and similar) isn’t going to cut it. We can calibrate lasers so they apply their heat at a specific precise distance under your skin. I wonder if we’ll find some way to provide electrical stimulation to the optical nerves without directly physically accessing those nerves.

Alex: precision growth of new nerves that will act as the go-between for the grey matter and the hardware.
Alex: little organisms that’ll be grown using your DNA, but built to perform the function of interacting with the hardware
Alex: *Shrugs*

Skennedy: Ooh. I’m so posting this.

I mean, seriously. Why are we using copper to transmit signals through the body (and yes, we are doing that) when we should be building a tertiary nervous structure using our own genetic material? We should be using our own body’s natural method of transmitting electrical impulses.

Alex and I talked a bit more, and he said that stylistically he’d want that to terminate behind the ear. Which is where people with cochlear implants have their terminal.

Speaking of which, wouldn’t cochlear implants benefit from using a grown nerve to transmit electrical impulses?

All that cyberpunk wetware crap? Think of it this way: If we could grow new sets of nerves to allow external devices to interact with our senses in new ways, why would we concern ourselves with installing hardware into our bodies? We’d have built a naturally infection-free direct transmission method allowing us to change our augmenting hardware as easily as we change hats.

I almost want to start writing a novel just so I can explore this.

~ by Skennedy on February 28, 2007.

38 Responses to “Speaking of visual display systems and sci-fi …”

  1. Alex is also batshit crazy. Why the hell would he want to do that. :-P I mean, everyone knows it would be cooler to have a socket in the back of the head, right where that little bump is. :-P

  3. Cyberware is so 1992. Bioware is where it’s at.

  5. Now I just want to go and play Shadowrun.

  7. In terms of current display technologies, you might be interested in this:
    http://www.hitl.washington.edu/projects/vrd/

    It’s a wearable display that uses a laser and MEMS mirror array to raster an image onto your retina.

    Sure, it’s still a display rather than augmenting sight directly, but it’s rather nifty tech – by adjusting the focal length lever I could get a clear picture out of my bad eye, so the display came out clearer than real life did. Current models are only red at 800×600 resolution, but they’re hoping to get a three-colour version working in future, as well as reducing the form factor (though it’s already quite light).

  9. Eye trackers have existed for years. I got to play with one when I was touring around the National Center for Supercomputing Applications in Champaign-Urbana, IL. It’s really no big deal to look for the black dot in the midst of all the color and white and beam the image in there.

    • Exactly. That’s the next step in the laser display system: Track the eye and redirect the beam into the pupil, wherever it goes. Eye tracking is already well understood, and precision laser beam aiming is trivial these days. The hard part, as I understand it, is getting everything in sync in real time. If you look suddenly to the left, the eye tracker, laser, and the spatially correct image all have to change as fast as the eye can move – for both eyes! That’s no big deal for watching movies or canned content, but it’s a big problem for playing video games and other interactive applications.
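A back-of-the-envelope sketch of why that sync problem is hard (the 500 deg/s saccade speed is an assumed ballpark figure, not something from this thread):

```python
# Rough latency-budget sketch for an eye-tracked laser display.
# Assumption: a fast saccade peaks at roughly 500 degrees per second.
SACCADE_SPEED_DEG_PER_S = 500.0

def aim_error_deg(latency_ms):
    """Angle the pupil moves while the tracker/laser pipeline catches up."""
    return SACCADE_SPEED_DEG_PER_S * (latency_ms / 1000.0)

# Even 1 ms of end-to-end lag costs half a degree of aim; a full 60 Hz
# frame (~16 ms) leaves the beam about 8 degrees behind the pupil.
for latency in (1.0, 5.0, 16.0):
    print(f"{latency:.0f} ms lag -> {aim_error_deg(latency):.1f} deg behind the pupil")
```

Which is why canned content (where the next frame is already known) is so much easier than interactive rendering, where the image itself has to be produced inside that same budget.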

      • Speaking as a physicist and a former laser jock, I’m not exactly comfortable with this idea of directing lasers into the eye. I’ve been flashed by diode and HeNe lasers multiple times. Even at 1mW it’s not a fun experience. Laser light is coherent, and thus not like the light you and I are used to seeing on a daily basis. It really wouldn’t be a good choice for this application.

        Also, you guys need to do what it takes to experience the CAVE immersive VR system. It’s like the holodeck on Star Trek, only smaller. The one I played in was, again, at NCSA, but UM-Ann Arbor also has one.

        • Oh yeah? I live fairly close, I wonder who I’d have to talk to about that.

          • Well, I don’t know exactly, but if you’re interested, you should definitely come to the 2600 meeting on Friday night. It’s at the Starbucks on South University, look for the group of us with laptops and other weird gear. I’ll be wearing a MARPAT boonie hat.

          • I know where that’s at. Next time, though – must see the girl!

          • Understood. It’s always there, the evening of the First Friday of the month.

    • Can you explain what you mean? An eye tracker, if I’m not mistaken, simply detects where you are looking – it doesn’t actually transmit new technical information through the eye, nor actually detect precisely what you see (as opposed to just detecting how the eye is oriented).

      • Guess there’s more than one kind of eye tracking. There are already eye trackers that literally track the position of the pupil in the socket. Very useful if you’re trying to aim a laser into them.

        • No, that’s what I meant. His message didn’t make sense as a response to my post, which was mostly about wetware implants and sending signals down nerve endings. They’ve had complex camera systems tracking eye movements for a long time; I just didn’t see how it related.

      • It’s tracking where you are looking by using machine vision to track your pupils. If you know where the pupil is, you can use that information to construct an image the eye can see in various ways.
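The dark-pupil trick described here can be sketched in a few lines: threshold the image and take the centroid of the dark pixels. This is a toy version of what real machine-vision trackers do; `find_pupil`, the threshold, and the synthetic image are all illustrative, not from any actual tracker:

```python
import numpy as np

def find_pupil(frame, threshold=50):
    """Estimate pupil position as the centroid of the darkest pixels.

    frame: 2-D array of grayscale intensities (0-255).
    Returns (row, col) of the centroid, or None if nothing is dark enough.
    """
    mask = frame < threshold
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return float(rows.mean()), float(cols.mean())

# Synthetic "eye": bright sclera with a dark square standing in for the pupil.
eye = np.full((100, 100), 200, dtype=np.uint8)
eye[35:45, 55:65] = 10
print(find_pupil(eye))  # → (39.5, 59.5)
```

Real trackers add glint detection, ellipse fitting, and calibration on top, but the core signal really is just "find the black dot."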

  11. Yet another contender

    The HEADPLAY Personal Cinema System
    http://gizmodo.com/gadgets/home-entertainment/headway-personal-cinema-system-apparently-the-visor-is-comfy-240641.php

    • Re: Yet another contender

      Native resolution 800×600, supporting 1024×768 – that’s not bad.

      Unrelated to the quality, can I just say I hate marketing material where they talk about speeds in bits AND bytes in the same paragraph?

Comments are closed.