Dev Log #12 - Build Menu and the perception of progress

I can understand why people do simple 2D menu systems. They're much easier and just as effective. If I were starting again then I probably would not have gone down the 3D menu route. I could bin it but I am still learning loads from it, so I am going to persevere.

Here is a montage of "not quite right" videos that chart where I was at the end of last week:

The grid of locations where you can build began life as a bunch of positional vectors but that soon became a pain in the arse when dealing with buildings that take up more than one space. I turned it into a simple grid system that converts grid positions into world co-ordinates when placing. Everything is subordinate to this grid, which I should have done from the start.
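As a rough illustration (the names and cell size are made up, and it assumes a flat build plane at y = 0), the grid-to-world conversion is just a scale and offset:

using UnityEngine;

public class BuildGrid : MonoBehaviour
{
    public float cellSize = 2f;

    // World-space centre of a grid cell; the y = 0 build plane is an assumption.
    public Vector3 GridToWorld(int column, int row)
    {
        return new Vector3((column + 0.5f) * cellSize, 0f, (row + 0.5f) * cellSize);
    }
}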

I've managed to fix the build menu this week too - the buildings scale correctly depending on how big they are. I've also started on move/delete. I am somewhat hamstrung by not having the next release of Unity with the shader creator I showed in the last update. I need to create "semi-transparent" or "edge" shaders so that you can see a ghost of where the building is going to go, and that's easy in the editor but a pain in code. They are due to release it in "April", which is running out. I do believe that timeline because they're already releasing betas for the release after, which suggests this one has gone gold. I could still write the shaders in code but my productivity won't be high enough.

I might move onto the cannon building, which I can do most of without the new shader. I have all the drawings for the cannon ready.

Perception of progress

I'm still working for an hour every morning but it definitely feels like progress is slower going. This is partly because I've needed to put in more infrastructure and partly because what I'm now doing is quite complicated! I'm managing to keep my tickets of a similar size and you can see that I've done a fair amount in this release so far.

I don't want to make another release until the building system is in and working properly. My weekly commit graph shows a regular pulse since the start of March, which makes me happy. It's a recognition that I get up out of bed at 0630 regardless of how shit I feel! The other two blobs to the left are me playing last year.

Comments

I've done a bunch of messing about with OpenGL shaders (GLSL), the most common shader programming language, and that is very simple to use. Seems Unity doesn't use that but instead uses the less common and slightly more fiddly Cg/HLSL language.

With GLSL you would define a fragment shader that took a uniform parameter and then changed the alpha of the colour output by the fragment; maybe even a geometry shader that adds some extra geometry to give a wireframe sort of look.

With Unity it seems fairly similar. From their shader example https://docs.unity3d.com/Manual/SL-VertexFragmentShaderExamples.html you would add a property to the shader to specify the transparency, then set that from your script using something like material.SetFloat("_Variable_Name", Value);
Then in the fragment shader adjust the transparency, something like this:

fixed4 frag (v2f i) : SV_Target
{
    // sample the texture
    fixed4 col = tex2D(_MainTex, i.uv);
    // override the alpha with the value set from the script
    col.a = _Variable_Name;
    // apply fog
    UNITY_APPLY_FOG(i.fogCoord, col);
    return col;
}

I think there is also some stuff you need to do to tell Unity this shader has transparency, like adding this to the SubShader tags: Tags { "Queue" = "Transparent" }
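To round that out, a rough C# sketch of driving that property from a script (the class and field names are made up; it assumes the shader exposes a float property called _Variable_Name as above):

using UnityEngine;

public class GhostFade : MonoBehaviour
{
    public float ghostAlpha = 0.4f;   // how see-through the placement preview should be

    void Start()
    {
        // Push the alpha into the material; the property name must match the shader's.
        GetComponent<Renderer>().material.SetFloat("_Variable_Name", ghostAlpha);
    }
}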

Pretty sure that will do the job ... in fact here is a basic tutorial for creating a transparent "hologram" shader; pretty easy, probably five to ten minutes of work for the basic effect.
https://unity3d.com/learn/tutorials/topics/graphics/making-transparent-s...

Evilmatt's picture

I've watched a bunch of shader tuts and the Cg/HLSL isn't hard at all. Very sensible. I could also borrow the code, mash it a bit and then use that - just to keep me going - but it seems a bit like a waste, given that the node editor is coming any day now. I don't really want to code them because maintenance is more difficult than with a node editor; the node editor is just so immediate.

brainwipe's picture

I have a cannon! Took me only about 30 minutes to model (it shows), and it's not got any fancy material on it yet.

LOL, the cannon looks hilariously tiny on that image. Still, you get the idea.

What makes this special is that it's made up of separate pieces and will be able to turn and point in the direction it is shooting. The side effect is that I've broken the scaling in the menu for the moment, as I must calculate the bounds of each separate mesh and combine them. This was definitely a case of "there must be an easy way to..." calculate the bounds of arbitrary points, and in this case there was, in the form of the Encapsulate function. The scaling has gone weird for the moment and I'm not sure why; I need to have a bit of a play. I've probably included something I shouldn't have in the encapsulation.
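For reference, a rough sketch of how the per-piece bounds might be combined with Encapsulate (a made-up helper; it assumes each piece of the model has its own Renderer):

using UnityEngine;

public class BuildingBounds : MonoBehaviour
{
    // Combine the world-space bounds of every child Renderer into one Bounds.
    public Bounds CombinedBounds()
    {
        Renderer[] renderers = GetComponentsInChildren<Renderer>();
        if (renderers.Length == 0)
            return new Bounds(transform.position, Vector3.zero);

        Bounds combined = renderers[0].bounds;          // start from the first piece
        for (int i = 1; i < renderers.Length; i++)
            combined.Encapsulate(renderers[i].bounds);  // grow it to enclose the rest

        return combined;
    }
}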

brainwipe's picture

While in the midst of huge painkillers, I managed to get cannons to shoot! I've already corrected some rotational weirdness in the video below but it does work - after a fashion.

I think I'm going to "fix" the rotation and then automatically do range finding. I can get the cannons to lead the target (it's all just vectors with a bit of time thrown in) but then they get too accurate and there is no skill to it.
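For illustration, a rough sketch of that sort of first-order leading (made-up names; it ignores the drop of the ball in flight):

using UnityEngine;

public static class Gunnery
{
    // Aim at where the target will be after the ball's estimated flight time.
    public static Vector3 LeadTarget(Vector3 cannonPos, Vector3 targetPos,
                                     Vector3 targetVelocity, float ballSpeed)
    {
        float flightTime = Vector3.Distance(cannonPos, targetPos) / ballSpeed;
        return targetPos + targetVelocity * flightTime;
    }
}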

Extra Credits

I've been aware of the Extra Credits YouTube channel for a while but now I'm going through the game design ones more seriously, as I want to learn from them and avoid common problems. Perfect coffee-time watching; especially as I watch them at 2x.

brainwipe's picture

That is a lot of cannon balls; those powder monkeys must be working really fast to load all those shells :D

They are also shooting your own ship a bit at certain angles. Are you going to allow that and introduce damage to the components exposed to self-fire, or disable shots where the line of sight is obscured?

Also you might want to add a small degree of randomness to the trajectory of the cannon balls. Not much, maybe a few fractions of a degree off their target vector; it would make things look a lot better than having them fly in the same fixed configuration.
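As a rough sketch of that suggestion (made-up names and values):

using UnityEngine;

public static class ShotJitter
{
    // Nudge a firing direction by up to maxDegrees on two axes.
    public static Vector3 JitterDirection(Vector3 direction, float maxDegrees)
    {
        Quaternion wobble = Quaternion.Euler(
            Random.Range(-maxDegrees, maxDegrees),
            Random.Range(-maxDegrees, maxDegrees),
            0f);
        return wobble * direction;
    }
}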

Evilmatt's picture

LOL, you're right. I agree that a cannon shouldn't fire if superstructure is in the way. When I get into range finding, I need to find a way to cast a "physics" ray along the path of the as-yet-unfired ball so that I can tell if it's going to hit superstructure. Not sure how to do that yet.

In a recent update, I added randomness to the power of the balls and that solved the problem you point out; they track slightly different paths.

Next thing to do is rotate the buildings and fix the cannon angle.

brainwipe's picture

I hope that is predictable randomness, just in case you ever get multiplayer implemented.

Bigger Rob's picture

A straight ray on the path of the ball's arc would mostly give you that information, since at short range the initial arc can be approximated by a straight line without much loss of accuracy; the curve would likely come later in the flight path. It would get you most of the way there. The edge case of firing at too steep an angle, so the ball comes down on the ship itself, could probably be solved by working out what that maximum angle is and limiting the arc of fire to prevent it; likewise trying to shoot through the deck.
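A rough sketch of that straight-ray check (made-up names; it assumes the ship's own colliders sit on their own layer):

using UnityEngine;

public class CannonSafety : MonoBehaviour
{
    // True if a straight ray from the muzzle along the firing direction hits
    // the ship's own colliders before it travels the firing range.
    public bool ShotIsBlocked(Vector3 muzzle, Vector3 fireDirection, float range, LayerMask ownShipLayers)
    {
        return Physics.Raycast(muzzle, fireDirection, range, ownShipLayers);
    }
}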

So long as the randomness is on the initial conditions, i.e. starting vector and speed, then it should be fine for multiplayer, since all the network-replicated objects should have the same starting conditions even though those are randomised.
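A rough sketch of that idea (made-up names; it assumes each shot's seed is agreed or replicated between machines):

using UnityEngine;

public static class ShotRandomness
{
    // Same seed in, same "random" power out on every machine.
    public static float DeterministicPower(int shotSeed, float minPower, float maxPower)
    {
        var rng = new System.Random(shotSeed);
        return Mathf.Lerp(minPower, maxPower, (float)rng.NextDouble());
    }
}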

Evilmatt's picture

Big R: yes indeed. Even without multiplayer I think that would be annoying. The randomness makes the balls land within about half a Cloudship width (if that makes sense).

EMW: good shout. I'll give that a go. I am probably going to spoof range finding. It's not exactly accurate!

brainwipe's picture