I wrote my last actual HLSL shader back in 2006. A few hardware generations later, I’ve started dabbling in shaders again.
The main motivation is probably my new job in games. Whereas VFX, with its mostly rigid offline-render pipeline, never gave me much of a sense of being able to experiment with its technologies, it seems far more accessible to just rip open Shadertoy and marvel at the little snippets of art being spawned from often even littler snippets of code. Well, and then open Unity and start chiselling away on something myself.
Subject #1 is water: ocean surfaces, water surface interactions and the like.
I began with very basic Gerstner waves, and as I’m very much at home there, I did so in VEX and Houdini.
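The displacement behind those Gerstner waves is simple enough to sketch outside of VEX. Below is a minimal single-wave version in Python rather than the VEX I actually used; the function and parameter names (`gerstner`, `steepness` and so on) are my own, not from the original setup:

```python
import math

def gerstner(x0, z0, t, amp=0.5, wavelength=4.0, speed=1.0,
             direction=(1.0, 0.0), steepness=0.8):
    """Displace one point of a flat grid by a single Gerstner wave.

    Unlike a plain sine wave, points are also pushed horizontally
    towards the crests, which gives the characteristic sharp tops.
    """
    k = 2.0 * math.pi / wavelength                      # wavenumber
    phase = k * (direction[0] * x0 + direction[1] * z0) - k * speed * t
    x = x0 + steepness * amp * direction[0] * math.cos(phase)
    z = z0 + steepness * amp * direction[1] * math.cos(phase)
    y = amp * math.sin(phase)                           # vertical displacement
    return x, y, z
```

In practice you sum several of these waves with different directions and wavelengths; keeping `steepness * amp * k` below 1 per wave stops the surface from looping over itself at the crests.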
This is just a viewport capture, but I was quite delighted with the fast results. Teaching VEX definitely pays off; I almost know what I’m doing now!
And here the same thing as an unlit shader in Unity.
I did not spend all that much time tweaking this to be actually pretty. But conceptually this is something I might make use of for an actual project in the nearer future.
With oceans somewhat sorted, I was further interested in actual surface water interaction. The first thing recommended to me was Wave Particles, an old concept but still very often used.
So far I’ve only got around to writing this in VEX. The particles only collide with a square container, and the displacement function needs some improvements. But overall I can see why this could come in handy, especially for limited use cases such as puddles or swimming, where the waves only need to carry a limited range and there is a simple boundary to work with.
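To make the wave-particle idea concrete, here is a stripped-down Python sketch of the scheme rather than my VEX implementation: each particle carries a small radially symmetric bump, the surface height at any point is the sum of all bumps, and particles travel in straight lines, reflecting off the square container. The names and the cosine bump shape are illustrative choices, not taken from the original paper or my setup:

```python
import math

def particle_height(px, py, x, y, amp=0.1, radius=0.5):
    """Height contribution of one wave particle at (px, py) to the
    surface point (x, y): a cosine bump with compact support."""
    d = math.hypot(x - px, y - py)
    if d >= radius:
        return 0.0
    return 0.5 * amp * (math.cos(math.pi * d / radius) + 1.0)

def surface_height(particles, x, y):
    # Superpose the contributions of all particles at the sample point.
    return sum(particle_height(px, py, x, y) for px, py in particles)

def advect(particles, velocities, dt, box=1.0):
    """Move particles and reflect them off a square [0, box]^2 container."""
    out = []
    for (px, py), (vx, vy) in zip(particles, velocities):
        px, py = px + vx * dt, py + vy * dt
        if px < 0.0 or px > box:
            vx = -vx
            px = min(max(px, 0.0), box)
        if py < 0.0 or py > box:
            vy = -vy
            py = min(max(py, 0.0), box)
        out.append(((px, py), (vx, vy)))
    return out
```

The appeal for puddles and pools is that the particle count stays small and the boundary handling is just a reflection test, instead of a full fluid solve over the whole surface.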
Well, this is where the journey into wet has concluded so far. I’ll post more when I have produced some more results.
Due to my day job, I’m somewhat of a Houdini power user. Considering that, it’s quite late for me to actually be playing with Houdini Engine for Unity for the first time. Sadly it’s just not a topic in VFX – yet.
But finally I’ve started chipping away on a prototype for a new game that requires dungeon creation and so I got around to crossing the bridge from SOPs into Unity via the Houdini Engine.
Here’s a small demo video:
This system is set up as a single HDA, which is driven by a bunch of input curves. The curves come straight out of Unity and allow me to interactively adapt the level creation.
From the curves I generate the dungeon in SOPs, with lofting, booleans and a bit of wrangling, and also set the attributes that connect Unity materials directly. This is my main concern, as I really want to get my world set up with a single click. I suppose I could also run a post-processing script; I’ve seen the hooks for that on the Unity HDA asset.
Works like a charm! I’m sure there will be a good use for this.
After tinkering with AR for my last post, I got somewhat hooked.
Same objective this time around: help your tiny friend get to the treasure. Now you have a (conveniently heavy) anvil, a see-saw and a wooden platform you can move around.
I managed to upgrade my camera, so I could try out a few camera angles and some compelling, story-driven editing. And once again, this was all done right on my desk.
It’s quite amazing how much difference the real world connecting into the virtual scene makes, even if it’s jittery tracking of crinkled paper markers that kinda obscure half the screen. There is definitely something in this approach that, with better software and hardware, will bring a small revolution to the gaming world.
And I’m not talking PS4 shooter replacements. I’m talking playing Monopoly or Settlers of Catan with your family, with animated characters that come to life on your living room table.
Or, I can’t resist a World Cup reference: play the next FIFA match against your colleagues at lunchtime on the office floor, with everyone standing around the field!
Well, early days. I’ll keep playing; let me know what you think!
The models are partially free assets from the Unity Asset Store. A few I modelled and painted with Sculptris. I just wanted to try it out, and it’s a useful and, most importantly for the average indie game developer, free alternative for 3D sculpting and painting.
Little man has made it through the forest in pursuit of a mighty treasure. Now, on top of a rocky peak, all that is left is to cross a mighty crevasse and be set for life in wealth and fame! But what to do? The treasure seems so close, yet is so far!
Now this is the point where the player comes in, lending a divine hand by lifting a wooden board to form a bridge and letting the little guy achieve his goal.
This is the next iteration of my experiments to come up with interesting use cases for AR and games. Apologies for the crappy web cam again, which makes the tracking just pop every now and then.
I want to explore what game mechanics can be constructed around direct human interaction with a virtual world and the characters within it.
This, again, is not really AR. It is a game I can play on my living room table. It does not interact with the real world, apart from me waving around a marker. But it instantly felt quite captivating, watching this small guy and his struggle, and lending him, quite literally, a hand. I’m doing this on a webcam that points towards me, so the directions are reversed. If you look closely, I messed up and hit him on the head with the board after he crossed. And I did feel rather bad! Says something about immersion, I’d say.
Well this is a start. There are tons of things that can be improved, but as a prototype I found this quite informative.
The tech behind it, in case you wondered, is once again Unity 2018, Vuforia and a bunch of amazing free 3D assets from the Asset Store.
As I now have Unity working pretty well with Vuforia and AR, I’m toying with a few mini-game ideas.
Still getting used to how it all ties together, and struggling a bit with the discrepancies between world and content scale, as it messes with physics and lighting. But I made progress. I spent a good half hour just playing and utterly failing to get the ball into the box.
Cleaned up the scene today and presented the whole thing a bit better. And I changed some physics parameters to make this actually work much more nicely.
It’s cool to interact with the world. It still feels a little too tinkerish, shuffling markers around while making sure the webcam – which is taped to my desk lamp – doesn’t fall off. It’s all a matter of time until there is a nice end-user interface and plain real-world object tracking. That’s what I’m going to start looking into next.
A while back I started on a two-player snake game, which I used to test out Löve2D. I’ve ported the code to Unity and set up a simple test with Vuforia.
The whole process is very straightforward. I was quite impressed with how well it tracks. A well-lit environment seems a must, though; when testing with a £5 webcam it struggled a bit.
Here’s the clip:
Apologies for the partly bad quality and clunky handling; holding a marker and a phone while playing is beyond the range of my motor functions.
For this test the interaction is quite simple, nothing particularly tailored for AR.
It lets you play snake on any flat surface, like a floor or a table. In the video I “glued” it to a marker. Touching the left/right half of the phone screen steers the snake; a two-finger tap resets.
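The steering logic is simple enough to sketch. Here is an illustrative Python version of the touch handling described above; the actual game is C# in Unity, and the names (`steer`, `step`, `DIRS`) are mine, not from the port:

```python
# Grid directions in clockwise order; turning steps through this list.
DIRS = [(0, 1), (1, 0), (0, -1), (-1, 0)]  # up, right, down, left

def steer(dir_index, touch_x, screen_width):
    """Turn the snake depending on which half of the screen was touched."""
    if touch_x < screen_width / 2:
        return (dir_index - 1) % 4   # left half: turn counter-clockwise
    return (dir_index + 1) % 4       # right half: turn clockwise

def step(head, dir_index):
    """Advance the snake's head one cell in the current direction."""
    dx, dy = DIRS[dir_index]
    return (head[0] + dx, head[1] + dy)
```

Relative left/right turns (rather than absolute up/down/left/right buttons) are what make the scheme work in AR, where the board’s orientation on screen changes as you move the camera.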
I have a few ideas for some cooler, more AR-esque mini games. Watch this space!
This is an early test using Unity NavMeshAgents and a simple animation state machine to generate autonomous little people running around in a rather grassy environment.
I’m thinking of using this in a later project. For now I’m dipping into all the different areas I think I’ll need, just to get a better understanding of the scope and complexity of the problems ahead.
So far this has all been rather straightforward. There are open questions about how to reliably streamline animation import into Unity while keeping it all as flexible as possible.