Good morning and thanks for popping in to our cozy little corner of the blogosphere. It’s week two of the Ultimate Coder: Going Perceptual Challenge and I’m here to give you the low-down on what we’ve been up to!
I’ve just made a warm pot of tea and have managed to find a single perfectly formed biscuit in the tin (Rob’s brutish foraging has left most of them in pieces, what a crime!) so I’ve got what I need to get comfortable and reveal all…
If you missed us last week, we introduced ourselves and talked a little bit about what we will be putting together to harness the raw power and unique abilities of Intel’s Perceptual Computing kit. I won’t bore you with a re-hash of that post but essentially we’re crafting a virtual experience that will transport you into a potter’s studio where you’ll be able to sculpt works of art with your own hands! Have a quick read via this link to get up to speed.
It’s been a bit of a hectic week for us as we’ve been putting in extra hours on our other commitments to clear our schedule for the next six weeks so that we can commit fully to making the most of our time in the challenge.
Let’s get started!
So, we’ve been moving forward with the final bits of R&D before diving into the code proper and are currently looking into how clay works. Clay, you see, is strange stuff. It’s a uniquely tactile substance that we as a species have harnessed to create works of art and utility for an unfathomably long time; we know it well.
That presents quite the challenge for us to recreate it digitally (especially when our goal is complete immersion). The user has to feel like they’re touching it when they’re not. And we have six weeks. Gasp!
But hey, this IS a challenge, why would we pick something easy to do?
The work ahead lies not only in simulating the physical properties of clay, like the thickness, texture and yield, but balancing that with what a user typically expects when there is no yield or resistance in the air. This challenge throws up several paths of R&D, and human expectation is one of those paths.
So far we’ve set up a bunch of raycasts for each finger. This gives us a projection of WHERE the finger might be heading and allows us to better judge what a user might be expecting, as well as where the user’s hand is in 3D space. There’s no 1:1 match here, so we have to adjust and tweak it until it perceptually matches rather than technically matches.
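To give a flavour of the idea, here’s a minimal sketch of that kind of trajectory projection: take a couple of recent fingertip samples, estimate a velocity, and cast forward along it. The function name, sample format and lookahead value are all illustrative assumptions, not our actual implementation (which lives in Unity and works against the PerC SDK’s tracking data):

```python
def predict_fingertip(samples, lookahead=0.1):
    """Project where a fingertip is heading by casting a ray along its
    recent average velocity. `samples` is a list of (time, (x, y, z))
    readings from the depth camera; `lookahead` is seconds ahead.
    Names and numbers here are illustrative, not from the actual build."""
    (t0, p0), (t1, p1) = samples[0], samples[-1]
    dt = t1 - t0
    # Average velocity over the sample window
    v = tuple((b - a) / dt for a, b in zip(p0, p1))
    # Ray: origin = current tip position, direction = velocity
    return tuple(p + vi * lookahead for p, vi in zip(p1, v))
```

The projected point is what you would then test against the clay surface, which is exactly where the “perceptual rather than technical” tuning happens.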
Perception versus reality continued…
I feel that quite often designers, artists and programmers alike strive for 100% faithfulness to what they’re trying to recreate virtually. Surely that’s the goal, right? Well, I’m not convinced that it is…
A great concept artist will sometimes bend the rules of perspective or light and shadow for impact. Perhaps they want to lead the eye or play with scale for effect. The same can be said for the programmer; perhaps you’re making a racing game, you could go all out and simulate realistic traction or you could make something that feels right, or even better, fun!
It’s a fine line: you do need a very strong understanding of the technically correct if you want to make something that is stylised for effect. Rob has a knack for physics, so we’ll see how he plays with the balance of realism and fun, as well as the difficult task of also making it a bit of a challenge, because we want it to be something you come back to and hone your skills with.
I’ve been thinking a lot about the setting of our virtual world as well as the overall colour palette and general ambience. One of my goals is to transport the user into another universe. One where, for a start, you don’t end up with dried bits of clay under your fingernails for days, but more importantly one that is beautiful and relaxing.
I’ve always been fond of the kinds of visuals you see in places like Marrakech – the almost heavenly terracottas and oranges contrasting against sensual magentas and reds all set against deep blue skies. I want to almost smell the spices. I’m not entirely sure if I can capture that, especially considering I am limited to the peripheral areas around the pot – perhaps 25% of the screen area but I’m going to give it my best shot.
The remaining 75% will be the clay itself, which I’m looking forward to. Specular highlights can be hard to get right; it’s easy to make something look very plastic-y. Another thing to consider is that the clay will be spinning at a reasonably fast rate, so details might get lost to the eye.
It’s something to think about at least for now while I put together the placeholder models and textures, it could all change and I might find that the visuals I want look quite noisy or detract too much from the experience, and I might have to scale it back to a minimalistic sort of look but we’ll have to see!
That’s all for this week, there wasn’t much by way of programmer talk here but we’ll be getting to that soon. We should be well into our first proper build by this time next week, so rest assured that on Monday we’ll be going deep into the mind of our mad scientist, foraging ape and programmer, Rob.
Feel free to send us your thoughts via Twitter or Facebook – you can grab me @ChimpSquared, Rob @SquaredApe and our official company feed @SimianSquared and http://www.facebook.com/simiansquared
Hello all and welcome to our very first Ultimate Coder: Going Perceptual blogpost!
I thought we’d use this opportunity to introduce ourselves and tell you about what we will be doing with the Perceptual Computing platform and where we’re coming from with our ideas:
Who are Simian Squared?
We are an independent developer based in London, England. You may recognise us as the developers of the upcoming platformer The Other Brothers and Physynth, a musical experience for iPad (which we’ll be telling you a little more about later on in this post).
What is Perceptual Computing?
Intel have provided us with a lovely piece of kit called the Interactive Gesture Camera, a stunningly powerful Ultrabook portable computer as well as their Perceptual Computing (PerC) SDK. In a nutshell, the PerC SDK allows us to capture and interpret physical gestures or movements and use them to immerse the user in our interactive experience.
Why did we decide to jump aboard the challenge?
That excitement of being able to work with something powerful and unknown is something that appeals to us and is the main reason behind us deciding to take a project like this on. We see Perceptual Computing as an opportunity to create a new kind of immersive experience, one that is almost tactile and that is very exciting to us.
We should probably start by talking to you a little bit about what we’ve done before in this area, and how it will translate into this project:
Back at the end of 2011 we took some time out of working on games (in the traditional sense of the word) to work on an experimental idea we had been bouncing around for a while.
So what was it?
Like with the Ultrabooks and Perceptual Computing today, Apple at the time had presented us with an exciting new canvas to work on and made developers lives easier with great hardware and support.
We had a large-screened device with a number of sensors at our disposal and a blank canvas to come up with something unique. Having this, we took the opportunity to wander away from our comfort zone of games and proposed an idea we had for a new kind of musical instrument – one that drew on my experience as a games developer, using tweakable physics simulations to trigger its sounds.
So how does that relate to this?
We looked at what other music app developers had been doing with their user interfaces and saw that for the most part everything was quite utilitarian, which is fine, but there was nothing really there that created an experience, nothing that pulled the user in and took them away from reality for a while.
We had been thinking that a traditional interface really wouldn’t cut it – this idea conjured up visions of old 1960s hardware, of finding an ancient bit of kit in your dad’s garage and firing up its half-working displays and temperamental sound units with wonder and a little sprinkling of intrigue.
After some brainstorming, we came up with a number of ways that we could immerse the player:
1. We decided that we would emulate physical hardware in realtime 3D. No pop-ups or drop downs allowed that could break the immersion, and it needed to be beautiful. High resolution, realistic textures and a fully three dimensional interface.
2. Following on from that, we came up with the idea of creating a sort of pseudo-3D depth effect by tilting the camera to match the angle at which the actual physical device is held. This gave the illusion of it being an actual, physical piece of hardware.
3. In conjunction with this, we lit the virtual hardware and wrote custom shaders which reacted with the lights in combination with the physical angle of the iPad to create a realistic surface that shone as you moved your iPad.
4. Finally, being a rather complex app, we wanted to create a manual whilst adhering to our rule of emulating a physical experience, so we made our own simulation of a physical book, with page turning and hand illustrated diagrams.
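The tilt trick in point 2 is worth a closer look. Here’s a rough sketch of the underlying idea, with all names and constants invented for illustration (the real Physynth version is a Unity camera rig driven by the iPad’s accelerometer): shift the virtual camera opposite to the device tilt so the 3D hardware appears to sit behind the glass.

```python
import math

def camera_offset(tilt_x_deg, tilt_y_deg, depth=0.5, max_tilt=45.0):
    """Map the device's physical tilt (from the accelerometer) to a
    small camera translation so the on-screen 3D hardware appears to
    sit behind the glass. `depth` is how far 'inside' the device the
    virtual panel sits; angles are clamped to +/- max_tilt degrees.
    All values here are illustrative, not the shipped tuning."""
    tx = max(-max_tilt, min(max_tilt, tilt_x_deg))
    ty = max(-max_tilt, min(max_tilt, tilt_y_deg))
    # Shift the camera opposite to the tilt; the panel then appears
    # fixed in physical space, producing the pseudo-3D depth illusion.
    return (-depth * math.tan(math.radians(tx)),
            -depth * math.tan(math.radians(ty)))
```

Held flat, the camera sits dead centre; tilt the device and the camera slides the other way, which is what sells the parallax.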
What does that mean for this project?
We want to take that kind of experience to the next level with the PerC. Allowing the user to physically interact with our virtual world just screams out to us as something we can use to make something special. We want to expand upon our initial ideas for visuals here, and bring something completely new to the table with the incredible gestural and motion controls and hopefully make the user feel an almost tactile experience with it.
Physynth was a niche product and one that did have a steep learning curve, and that’s something we realised limited its accessibility – for this project we will be learning from that and creating something that anyone can use be they eight years old or eighty years old.
Right, enough background – give me the details: what are Simian Squared actually making here?
We will be creating a virtual pottery wheel, set in a beautiful location that will give users the chance to use their hands to sculpt digital clay into beautiful works of art. The user will use physical gestures and motions to mould, manipulate and then paint the clay.
We will put a large emphasis on getting it to feel as natural, and look as beautiful as possible to give the user a wonderful experience that also fully demonstrates the power of the PerC and Ultrabook. We will be leveraging our experience with rendering and physics to get the most out of the Ultrabook and PerC hardware.
So how far along are we?
Well, so far we’ve been split between doing R&D for the tech we will be writing for this project, and wrapping up on development for our upcoming platformer, The Other Brothers – we’re about ready to kick into full gear with the challenge though and will be keeping you updated on all goings on over the coming seven weeks.
We’ve taken a long, hard look at the Intel PerC SDK and it’s incredibly exciting. They’ve stripped out the main technical hurdles and left us with a creative level playing field – and we can use Unity for this. The cool part of Unity is being able to leverage an awesome workflow and great special effects.
We’ll be talking in depth about this process of using custom shaders plus sharing source code as the blog progresses, so keep checking back for an in-depth look at our creative process and source code!
There’s been a lot of news circulating around a blog post made by ‘dotBunny’ regarding the ‘cancellation’ of The Other Brothers; Thomas Pasieka addressed this in recent interviews which we thought we’d post here:
Thomas Pasieka: After me and Bjorn designed the game, we looked for a programmer and Matthew Davey (dotBunny) was trialled for that role but his work failed to meet our expectations and so we chose another developer (Simian Squared). (source)
“The statements made by dotBunny regarding the ‘cancellation’ of The Other Brothers are untrue and we would like to clarify that dotBunny are in no way affiliated with the game,” stated Pasieka. “Please disregard any statements made by them.” (source)
So that clears that up, we’re enjoying development of The Other Brothers and will have more to show you soon!
I’ve been working on the ultimate “feel” for TOB. I’ve been programming for over 25 years, and playing games longer, so I think I’ve got a good idea how a game should feel.
Some of the important considerations are about how responsive it is on analogue sticks and touch screens. We want the on-screen character to do the opposite of Mario:
Instead of slowing down sharply when the player moves the other direction, we slow down sharply when you reach the deadzone of the touch input or analogue stick, or when the finger has been lifted from the screen. This works naturally with the time it takes to slide a typical finger across.
The result is on-screen you see a seamless and charming little devil rushing around with sharpness and no lag to the inertia – it’s crisp and yet not stiff.
For jumping, we have some funky raycasts that help us pass through platforms. If you fall off the edge of a platform and hit jump just after, the game gives you a few precious milliseconds of grace and lets you take the jump. This isn’t something you normally notice, but it sure is appreciated on touch screens.
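This trick is often called “coyote time”, and the core of it is tiny: track how long it has been since the character was last grounded, and allow the jump while that timer is under the grace window. The class name and the 0.1-second window below are illustrative; the actual TOB numbers aren’t public.

```python
class JumpController:
    """Grace-period ('coyote time') jumping: the player may still jump
    for a short window after walking off a ledge. The grace value is
    a guess for illustration, not the shipped tuning."""
    GRACE = 0.1  # seconds of grace after leaving the ground

    def __init__(self):
        self.time_since_grounded = 0.0

    def update(self, grounded, dt):
        # Reset while standing on something; otherwise count up.
        if grounded:
            self.time_since_grounded = 0.0
        else:
            self.time_since_grounded += dt

    def can_jump(self):
        return self.time_since_grounded <= self.GRACE
```

On a touch screen, where the input itself adds a few frames of slop, that window is the difference between a jump that feels fair and one that feels like it ate your press.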
Another element was that we wanted it to speed up quickly, but not travel very far if the player just edges towards something. We wanted that nippy feel. Likewise, it shouldn’t accelerate too slowly overall when changing direction. To do this, we blend three different curves of tweened motion, each active at different times, which gives the character specific rules for how it behaves with physics and keeps the focus on playability.
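One way to picture a three-curve blend like that (the phase boundaries, curve shapes and speeds below are entirely illustrative, not Rob’s actual curves): an ease-out burst for the nippy start, a linear ramp for the predictable middle, and an ease-in to approach top speed smoothly.

```python
def speed_from_hold(hold_time, max_speed=6.0):
    """Piecewise speed response built from three tween curves, each
    governing a different phase of input. Timings and shapes are
    illustrative, not the actual TOB values:
      0.00-0.15 s: ease-out burst -> nippy start, short travel on taps
      0.15-0.40 s: linear ramp    -> predictable build-up
      0.40 s+   : ease-in to cap  -> smooth approach to top speed
    """
    if hold_time < 0.15:
        t = hold_time / 0.15
        return max_speed * 0.4 * (1 - (1 - t) ** 2)   # ease-out
    if hold_time < 0.40:
        t = (hold_time - 0.15) / 0.25
        return max_speed * (0.4 + 0.35 * t)           # linear
    t = min(1.0, (hold_time - 0.40) / 0.3)
    return max_speed * (0.75 + 0.25 * t * t)          # ease-in
```

The curves are chosen so each phase meets the next at the same speed, which keeps the motion continuous even though the rules change under the hood.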
That’s all for now… the next time I post will be about the scrolling!