I’ve recently been looking at low-barrier-to-entry ways of creating VR and AR experiences, particularly aimed at kids. A friend mentioned aframe.io to me over coffee and I thought I’d give it a look-see.
A-Frame lets you create WebVR scenes in an easy-to-understand, HTML-like markup language.
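To give a flavour of what that looks like, here’s roughly the smallest scene you can write (the release number in the script URL is just whichever version you’re on):

```html
<!DOCTYPE html>
<html>
  <head>
    <!-- Pull in the A-Frame library -->
    <script src="https://aframe.io/releases/1.5.0/aframe.min.js"></script>
  </head>
  <body>
    <!-- a-scene sets up the camera, renderer and the Enter VR button for you -->
    <a-scene>
      <a-sphere position="0 1.25 -5" radius="1.25" color="#EF2D5E"></a-sphere>
      <a-sky color="#ECECEC"></a-sky>
    </a-scene>
  </body>
</html>
```

That’s it: open the file in a browser and you have a sphere floating in a 3D scene.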
I followed a lot of the tutorials available on their site (the documentation really is incredible), and started creating basic shapes and interactions. I bought a Google Cardboard VR set for about £5 and a controller for about £7 off of Amazon.
First attempts were very basic and while I was very taken with the platform, I was stuck trying to think up a fun application for the technology. But then inspiration struck.
Once I started playing with aframe.io I was blown away. My background is in web development, so working in a markup language was a no-brainer for me. Making the shapes (or primitives) was easy, and I could apply sizes, animations, particle effects and so on. I haven’t thought too much about interactivity from the end user’s perspective yet. Being in VR changes things up and there’s so much to take into account. I stifle a giggle thinking about users randomly bumping into each other like I did in my kitchen the first time I tried it out with my Cardboard headset, but I also realise that this is the wrong intent. I’d like to have the user in the centre, only needing to turn through a full 360 degrees to interact with the planets.
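As a rough sketch of what I mean by applying animations to primitives (this uses the animation component syntax from more recent A-Frame releases; the colour, size and timing values are just placeholders):

```html
<a-scene>
  <!-- A 'planet' that spins continuously around its Y axis -->
  <a-sphere position="0 1.5 -3" radius="0.5" color="#B5651D"
            animation="property: rotation; to: 0 360 0; dur: 10000; easing: linear; loop: true">
  </a-sphere>
</a-scene>
```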
What I like about this so far is the instant feedback of refreshing a web page to see the results, as well as the low footprint of the technology needed to leverage WebVR. I carry out the majority of the coding on my Surface Pro 4, but one night I was lying awake thinking about a planet texture and modified the code from my iPad Pro.
On my Surface or Mac I use Atom as my code editor of choice – someone’s already built a package for it that supports syntax checking, so it’s really easy to code in. On the iPad I connect to my remote site and use the built-in editor in Panic’s Transmit.
I’ve only spent about five hours on this so far, perhaps a little more. Wyatt’s been hanging out on my lap during some coding sessions and giving me instructions on what to do next (animating Venus is big on the list). It’s been a great way for me to learn a new method of coding and for him to learn about the planets a bit more. When I think about how many other available 3D models are out there for free download (like dinosaurs) it strikes me that it’ll be fun to link projects like this to what he’s learning about in his playgroup at the moment – so I’m quite looking forward to that from a parenting perspective.
Things to Note
- At the moment, the load times are a little insane. I think this is primarily down to using texture maps not optimised for the web experience.
- I haven’t found a way for entities to inherit the animation instructions yet. There must be a more efficient means of doing this so that the code isn’t repeated for each entity.
- I’m using an external library to render an animated GIF for the Jupiter texture. This is a particular point of pride: Jupiter shows a motion map as well as its rotation, giving it a more realistic appearance.
- Would like to find animated weather maps for all of the other planets (especially Earth).
- Can’t get a controller to work yet. WASD works on PC and Mac keyboards, but you can only use a particular ‘direction’ once on mobile devices (iPad and iPhone) before you’re stuck moving permanently in that direction and are forced to reload.
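On the repeated animation code: I suspect A-Frame’s mixins are what I’m after – you declare shared components once and reference them by id from each entity. A sketch of how that might look for the planets (the ids, positions and colours here are made up):

```html
<a-scene>
  <a-assets>
    <!-- Declare the shared spin animation once as a mixin -->
    <a-mixin id="spin"
             animation="property: rotation; to: 0 360 0; dur: 8000; easing: linear; loop: true"></a-mixin>
  </a-assets>
  <!-- Each planet inherits the animation via mixin="spin" instead of repeating it -->
  <a-sphere mixin="spin" position="-1 1.5 -3" radius="0.4" color="#C1440E"></a-sphere>
  <a-sphere mixin="spin" position="1 1.5 -3" radius="0.6" color="#E3BB76"></a-sphere>
</a-scene>
```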
One thing I haven’t done yet is check in with their Slack channel to ask questions about the above. I’m quite happy so far to push on and discover through mistakes.
You can view the progress of this project by visiting the Pahpayguh.