Sunday, February 3, 2013

No less than 12 years I've waited for Meta View...

...or at least the concept they are bringing to life. I have so much to say that I'm too overwhelmed to write a long post, so I'll keep this simple. If you have heard my rants about the future of computing interfaces (Augmented Reality, aka NO MORE MONITORS), then you have heard me describe Meta View — though until now, nobody had brought all the technology together with the same intent I had. It's finally happened! Go check them out, and if you know me and my ideas, PLEASE comment here to encourage them to get me set up to develop my idea (the software side) using their hardware. I don't know if it's just a matter of selection (like the Leap Motion I got), cost, or what, so any encouragement is good.

My message to Meta View:

I began working with "VR" as a novice developer circa 1994 on a Zenith Data Systems 8086 with a CGA amber monochrome screen, writing my own graphics libraries in assembly language and consuming them in QBasic (though I started coding in GW-BASIC) to build a virtual bobsledding simulator. Next I incorporated a mouse and built my own 3D asset system to import, parse, visualize, and manipulate wireframe objects. That "3D viewsystem" was also entirely my own doing, with the help of nothing but the built-in documentation for syntax/methods and a college algebra book (this was while I was in middle school) from which I learned the necessary trig.

Since those days I have been tracking the progress of all things VR/AR, and as far back as 12 years ago I was preaching the future (more like the present now) of computing interfaces. My vision was simple: a transparent, stereoscopic HMD integrated with head/location tracking that would let me overlay my computing environment onto my real world instead of fitting it onto a monitor (the term "Augmented Reality" wasn't one I'd heard of back then). Now I sit with 8.7 million pixels in front of me (one each of 1080p, 1200p, and 1600p monitors) and STILL feel like I'm trapped. I wanted to put application elements (windows, tickers, videos, etc.) where they made sense in my environment.

Now I'm developing for Leap Motion (I have a gen 6 unit), and next month my Oculus Rift shows up. The Rift is a stop-gap solution that I intend to hack by adding cameras I can redirect to create the illusion that Meta looks to be tackling head-on (with transparent AMOLED, I would guess). My goal with these products is to create a gesture-based window management solution (ideally for DWM in Windows, but most likely as a plugin for Compiz Fusion) that will allow me to get rid of my monitors for primary usage.
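For what it's worth, the core of the wireframe "viewsystem" I describe above really does come down to a bit of trig: rotate the vertices, then perspective-project them to screen coordinates. Here's a minimal modern sketch in Python (the function names and the cube data are my illustration, not the original QBasic/assembly code):

```python
import math

def rotate_y(point, angle):
    """Rotate a 3D point around the Y axis by `angle` radians."""
    x, y, z = point
    c, s = math.cos(angle), math.sin(angle)
    return (c * x + s * z, y, -s * x + c * z)

def project(point, viewer_distance=4.0, scale=100.0):
    """Perspective-project a 3D point to 2D screen coordinates."""
    x, y, z = point
    factor = scale / (z + viewer_distance)  # farther points shrink
    return (x * factor, -y * factor)        # flip Y for screen space

# A unit cube: 8 corner vertices, plus edges as index pairs -- the same
# kind of structure a simple wireframe asset format would parse.
vertices = [(x, y, z) for x in (-1, 1) for y in (-1, 1) for z in (-1, 1)]
edges = [(a, b) for a in range(8) for b in range(a + 1, 8)
         if sum(va != vb for va, vb in zip(vertices[a], vertices[b])) == 1]

rotated = [rotate_y(v, math.radians(30)) for v in vertices]
screen = [project(v) for v in rotated]
# A line-drawing routine would then connect screen[a] to screen[b]
# for each (a, b) in edges.
```

Back then the rotation and projection were the same math, just hand-rolled in assembly for speed and driven from BASIC.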

Meta View would take all the hackery out of the equation for a proof of concept (I haven't seen any specs yet on resolution or pixel density, so I don't know whether the product will be appropriate for general computing) and let me focus on the end goal of the software interface (gestures, window management, and the like). All these years I've lacked the resources to patent my ideas, build prototypes, and see it through, and at the same time I've been skeptical that I would succeed in that type of business (though I'm very self-motivated and confident otherwise). You would be hard-pressed to find anyone outside of your company, myself excepted, who is MORE enthusiastic about the vision we share for the future of computing interfaces.

I will be entirely honest and say that my professional experience with project management, architecture, and development is not strong with the exact tools I'll need for this project, but at the same time I'm experienced enough to already know WHAT I need (hooking DWM or building for Compiz, gesture recognition, etc.), and I imagine you'll be providing an API and SDK given the sensors I see on the unit. I have been a professional developer for 12 years now thanks to a quick start in the US Marine Corps (and easily 7 years as an enthusiast before that). I have a Bachelor of Science in Information Systems Security, and on any given day I am either building software, building/maintaining databases, managing our IT assets (networking, desktops, etc.) for my current venture, or hacking/tinkering with the latest tech. I've been to Google I/O twice and plan on going again this year, own (the first major Android hacking site aside from XDA-Developers, though we've since gone quiet), and ultimately I hope to see this tech integrated well enough with a mobile device for TRUE mobile AR to become a reality. I think we have all the technology we need now. We just need companies like yours to provide the hardware and people like me to provide the ideas/software.

Please seriously consider me for early access to a developer unit so that I can begin working with the best solution for my idea as soon as possible, instead of being forced to make do with other solutions that aren't quite ideal for it.

As a developer myself, I'm fully familiar with NDAs and beta testing and can assure you that I'm a good candidate even aside from my ideas.

Sincerely and enthusiastically,

Michael Richardson