State of Decay was a flawed but interesting XBLA title from 2013, with a later PC port that added some improvements, and two DLC add-ons that brought different play modes and additional story.
Throughout my career, I have found myself repeating the following mantra:
Don't build an engine in an engine. Use the engine.
If you need a flexible schema in SQL, use SQL's own schema commands (ALTER TABLE ... ADD COLUMN and the like) rather than building your own schema system on top. If you need a series of pages that follow one after another, use ASP.NET with HTML hyperlinks; don't build another engine inside it.
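As a sketch of what "use the engine" means for the schema case, here is the idea with SQLite (the database and table are hypothetical): when a record needs a new field, ask the database to change the schema instead of emulating columns with a generic key/value table.

```python
import sqlite3

# "Use the engine": the database already knows how to add columns.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ships (id INTEGER PRIMARY KEY, name TEXT)")

# Requirement changes: ships now need a top speed.
# One DDL statement, instead of a home-grown attribute table.
conn.execute("ALTER TABLE ships ADD COLUMN top_speed REAL")

conn.execute("INSERT INTO ships (name, top_speed) VALUES (?, ?)", ("Puffy", 12.5))
row = conn.execute("SELECT name, top_speed FROM ships").fetchone()
print(row)  # ('Puffy', 12.5)
```

The alternative, an entity-attribute-value table, is exactly the "engine inside an engine" the mantra warns about: you end up reimplementing typing, constraints, and indexing that the database gives you for free.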
Oculus just released their latest headset, the Oculus Go. It's basically a revamped Gear VR, but as a standalone unit with the screen integrated into it. It maintains compatibility with a lot of existing Gear VR software.
I can understand why people do simple 2D menu systems. They're much easier to build and just as effective. If I were starting again I probably wouldn't go down the 3D menu route. I could bin it, but I'm still learning loads from it, so I'm going to persevere.
Here is a montage of "not quite right" videos that chart where I was at the end of last week:
I got to the point where I was writing a lot of code in my experimental builder project. I wondered at what point I should move all of this stuff into the main project again, so I asked Reddit for best practice and got some reasonable replies. I decided to move the code into the main project immediately and create a new scene.
The more I played, the more I realised that the physics felt off. Everything looked like toys bobbing around in the bath. I played with gravity and mass settings and got some hilarious results, but I wanted the Cloudships to feel like they had heft. One of the things that really struck me was the cannonballs. They moved too fast, which made the scale look odd.
Unity uses state machines for controlling NPCs. I have two states on the enemy at the moment: if the player is a long way away then keep on your course; if the player is close then steer RIGHT AT IT!
That's not a very interesting bit of AI but it helps to test a whole range of things, including collision. Unity uses low poly collision meshes that are generated from your models automatically, which saved me a ton of work.
I didn't want players to hold down W all the time. I'm sure I'm not the only one who feels the strain after a day of pressing W to go forward! You now tap W/S to increase/decrease thrust and the EOT moves along in time. The compass turns along with the camera (clamped to North) and the green arrow shows the track over the ground... although it doesn't seem to take camera angle into account, so I'll fix that next.
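The tap-to-change-thrust scheme behaves like a stepped engine-order telegraph. Here is a minimal sketch of that idea; the number of steps, the thrust values, and the key names are all assumptions, not the project's actual settings.

```python
# Hypothetical engine-order telegraph: taps of W/S step thrust up/down,
# so nobody has to hold a key to keep moving.
EOT_STEPS = [-0.5, 0.0, 0.25, 0.5, 0.75, 1.0]  # assumed: half astern .. full ahead

class Telegraph:
    def __init__(self):
        self.index = 1  # start at "stop" (thrust 0.0)

    def tap(self, key):
        if key == "w":
            self.index = min(self.index + 1, len(EOT_STEPS) - 1)
        elif key == "s":
            self.index = max(self.index - 1, 0)

    @property
    def thrust(self):
        return EOT_STEPS[self.index]

eot = Telegraph()
for key in "wwws":   # three taps ahead, one tap back
    eot.tap(key)
print(eot.thrust)    # 0.5
```

Clamping at both ends means mashing W past full ahead does nothing, which matches how a real telegraph feels.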
Also, the controller doesn't cast a shadow (thanks EMW).
I want to be able to support different controls eventually (particularly an Xbox-style controller). Unity3D takes care of a lot of this out of the box, so as long as I don't rely solely on point-and-click, it just works.
One problem I have is that it is difficult to tell what control effect you're having on the Cloudship. This has been exacerbated by moving the camera around. The controller will need to tell the player what's going on with the Cloudship. My plan for the controller was to group together similar things: helm, EOT (speed), compass etc on a barrel:
Keeping organised has allowed me to make best use of the limited time I get at my computer. An hour here, an hour there. I can look at videos during lunch break to fill gaps in my understanding so that I can get started more quickly.
If a personal project hits a problem, I find it harder to go back to, because it becomes the path of considerable resistance compared to playing Minecraft etc. The Puffy Little Clouds problem I detail below was one of those; I just forced myself to get on with it.