Updates about my experiences with game development, programming, and making awesome games.
Sunday, May 22, 2016
Spell Casting in VR Update
Thursday, May 19, 2016
Playing With Fire
I recently got my Oculus Rift in the mail and had to start playing around with some ideas. Here's a short demo of a spell casting concept. I used the Leap Motion for hand tracking, and fire spells by opening and closing my hands; when my palm opens to about 60%, a spell is fired. You can also affect the speed of the missile with the velocity of your hand. It's a pretty fun little concept I may expand on.
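The firing logic could be sketched like this (a minimal Python sketch of the idea, not the actual Unity/Leap Motion code; the threshold and speed constants are made-up stand-ins):

```python
# Fire a spell when palm "openness" crosses a threshold, and scale the
# missile's speed by the hand's velocity. All constants are illustrative.

FIRE_THRESHOLD = 0.6   # fire when the palm is ~60% open
BASE_SPEED = 5.0
VELOCITY_BONUS = 0.5   # extra missile speed per unit of hand speed

def update_hand(openness, hand_speed, was_open):
    """Return (fired, missile_speed, is_open). Fires only on the frame the
    palm crosses the threshold, so holding the hand open doesn't refire."""
    is_open = openness >= FIRE_THRESHOLD
    fired = is_open and not was_open
    missile_speed = BASE_SPEED + VELOCITY_BONUS * hand_speed if fired else 0.0
    return fired, missile_speed, is_open
```

The `was_open` flag gives simple edge detection, so one open gesture produces one spell instead of a spell every frame.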
Thursday, October 29, 2015
Mobile Infinite Downhill Derby VR
I recently started working on an infinite arcade-style racing game which uses nothing but head tilt to control the vehicle: turning is controlled by rolling your head slightly left and right. This lets you look around without steering out of control, and it means you don't need a controller to play on the Gear VR. Check it out.
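As a rough Python sketch of how head-roll steering like this can work (my own simplification, with assumed deadzone and saturation angles, not the game's actual code):

```python
# Map head roll (degrees) to a steering value in [-1, 1], with a small
# deadzone so looking around doesn't steer the car.

DEADZONE_DEG = 5.0    # ignore tiny rolls
MAX_ROLL_DEG = 30.0   # roll at which steering saturates

def steering_from_roll(roll_deg):
    sign = 1.0 if roll_deg > 0 else -1.0
    mag = abs(roll_deg)
    if mag <= DEADZONE_DEG:
        return 0.0
    t = min((mag - DEADZONE_DEG) / (MAX_ROLL_DEG - DEADZONE_DEG), 1.0)
    return sign * t
```

The deadzone is what makes "look around without steering out of control" possible: small natural head motion maps to zero steering.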
This video may be down and the old file is missing; here's a link to it on Facebook:
https://www.facebook.com/anthony.graceffa/videos/10203659919217039/
I modeled the car and road pieces myself, and I'm currently working on a track generation algorithm.
Next on my list to script:
- Getting the suspension to update the axles correctly
- Random traffic
- Scoring system
Thursday, May 7, 2015
New Fauxtography Video
Once again, the ChallengePost page is here:
http://vrjam.challengepost.com/submissions/36226-fauxtography
Thursday, March 19, 2015
Fauxtography Progress
I've started a repository for my scripts if you want to see them; I'll be uploading everything with a delay of a few days:
https://github.com/AnthonyGraceffa/Fauxtography-VR
My favorite part of this project so far was choosing how to grade photos. I first started fiddling with a way to judge the photo based on where the subject was in the frame, doing a bunch of vector math. I originally wanted the user to try to use good composition and the rule of thirds with their photos, but I ended up going with something simpler: photos are awarded more points if the subject is centered (within 20 degrees or so of the center) and the closer the subject is to the camera (judged by the magnitude of a vector shot towards the target). For now this is all I'll do to grade the photo, but I may change my mind in the future.
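The grading idea can be sketched roughly like this in Python (the cone angle, range, and point values are assumptions for illustration, not the game's real numbers):

```python
import math

# Score a photo from two things: is the subject within ~20 degrees of the
# view center, and how close is it to the camera?

CENTER_CONE_DEG = 20.0
MAX_RANGE = 50.0  # beyond this distance, no proximity bonus

def grade_photo(cam_forward, to_subject):
    """cam_forward: unit view direction; to_subject: vector camera->subject."""
    dist = math.sqrt(sum(c * c for c in to_subject))
    unit = [c / dist for c in to_subject]
    dot = sum(f * u for f, u in zip(cam_forward, unit))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot))))
    score = 0.0
    if angle <= CENTER_CONE_DEG:
        score += 50.0                                      # centered bonus
    score += 50.0 * max(0.0, 1.0 - dist / MAX_RANGE)       # proximity bonus
    return score
```

The "magnitude of a vector shot towards the target" from the post is just `dist` here, and the centered check is the angle between the camera's forward direction and the direction to the subject.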
A "SnapShot" taken from the game
Finding whether a "Target" was in the photo was also interesting. I originally just used Renderer.isVisible, but this doesn't take into account one object being hidden behind another, even if it's in the camera's view port. What I ended up doing was looking at each subject that was visible and firing a raycast at it; if the ray hits, the subject counts as captured. The result is that if the target is obscured enough, the ray may not hit it, but I'm going to play this off as "you need a clear shot" to get points for capturing a subject. Once again, this is subject to change later.
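Here's a toy Python version of that capture test. The real game uses Unity's Renderer.isVisible plus a raycast; this stand-in models everything as spheres just to show the occlusion logic:

```python
# A subject counts as captured only if a ray from the camera reaches it
# before hitting any obstacle. Spheres are (center, radius) pairs.

def ray_hits_sphere(origin, direction, center, radius):
    """Return distance along the (unit) ray to the sphere, or None."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None
    t = (-b - disc ** 0.5) / 2.0
    return t if t >= 0 else None

def is_captured(origin, direction, subject, obstacles):
    t_subj = ray_hits_sphere(origin, direction, *subject)
    if t_subj is None:
        return False
    for ob in obstacles:
        t = ray_hits_sphere(origin, direction, *ob)
        if t is not None and t < t_subj:
            return False  # something blocks the shot: "you need a clear shot"
    return True
```

This reproduces the behavior described in the post: a partially obscured target fails the test whenever the occluder sits closer along the ray than the subject.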
So far so good. The next things on my list to implement are the photo album, the ability to delete snaps from the game, animal AI, and finishing "Bidwell Park". Some people have requested the ability to "zoom", but since I'm building this for mobile I want it to work with only one button. Maybe I could use voice commands or something; I guess we'll see.
Monday, March 9, 2015
Photo VR - Saving the Snap
I've started working on my project for Oculus' "VR Jam", an event where developers are encouraged to create a VR game or experience for the Samsung "Gear VR". I don't own a Gear VR, but I thought I'd go ahead and test with the DK2 until I find someone that does. My game is pretty simple: anyone who's played the N64 classic "Pokemon Snap" will find the concept familiar. You are carted around an environment and take pictures of the wildlife around you. Your "snapshots" are graded on things like composition, subject matter, and lighting. Since the game is a mobile experience, I'm designing it around using one button, so bait and such probably won't be a feature, but we'll see.
I started with the "PictureTaker" class. This handles saving snapshots to file, as well as the game data used to judge the quality of the photo. I've run into one obstacle so far working with VR, which wasn't too much of an issue but was fun to solve. Since the screen is rendering two cameras, one for each eye, I can't just use Application.CaptureScreenshot, because the snap would show what you'd see if you tried to view the game on a normal monitor, or this:
This is obviously not ideal, so my solution was to have a third camera in the scene that is disabled by default. When the fire button is pressed, the camera is re-activated and aligned between the two eye cameras; this camera renders to a RenderTexture, and a frame from that is saved to file as a PNG. The result is this:
Much better...
For saving the game data I made a subclass called "SnapShot", which contains the game data for each picture, like what level it was taken in and what it's a picture of. As of now, to tell the subject I'm just firing a single ray from the center and pushing that object's name onto the SnapShot's list of subjects; the snap is then added to another list that will eventually be serialized and saved to file. The next trick is to make sure the saved game data lines up with each saved PNG. Here's my class so far:
http://pastebin.com/WMVrNBbu
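The actual class is in the pastebin link above; here's a rough Python sketch of one way to keep each PNG aligned with its serialized game data (my own scheme for illustration, not the class from the link): give each snap an id, name the PNG after it, and store the same id in the record.

```python
import json
from dataclasses import dataclass, field, asdict

# Each SnapShot carries an id; the PNG filename is derived from that id,
# so the image on disk and the serialized record can always be matched up.

@dataclass
class SnapShot:
    snap_id: int
    level: str
    subjects: list = field(default_factory=list)

    def png_name(self):
        return f"snap_{self.snap_id}.png"

def save_album(snaps, path):
    """Serialize the list of snaps to a JSON file."""
    with open(path, "w") as f:
        json.dump([asdict(s) for s in snaps], f)
```

Deriving the filename from the record (instead of saving them independently) means the two can never drift out of sync, which is the whole "lining up" problem.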
Tuesday, January 28, 2014
Global Game Jam 2014 and Fusion Knights Update
This weekend, in addition to celebrating my 21st birthday, I partook in the 2014 Global Game Jam. For those who aren't familiar, the GGJ is a worldwide event where you and your team must create a game from scratch in 48 hours. The jam started Friday at 5:00 pm and ended on Sunday. Chico this year had 4 teams in total, all of whom produced interesting games, which you can check out here. In addition, I set up some camera equipment and streamed the entire process. The stream was extremely fun for everyone; you can view the archived videos here:
http://www.twitch.tv/gimblegames/profile/pastBroadcasts
My team of 9 worked on a game called "Can Hue See Me Now?", a mobile game of tag where one player spawns as "it" and the other players must run away and grab coins. However, we had to design around the GGJ theme, which was "We don't see things as they are, we see them as we are". So our twist was that you spawned as a random color, and while standing over a tile with your color you were invisible to the tagger. Unfortunately, we could not finish the game on time, but you can play what we finished here:
http://globalgamejam.org/2014/games/can-hue-see-me
and a screen shot:
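The color twist boils down to a tiny predicate like this (a Python sketch based on the description above, not the jam code; the reach value is made up):

```python
# The tagger can only tag a runner who is visible, i.e. NOT standing on a
# tile matching the runner's own color, and who is within tagging reach.

def can_tag(tagger_pos, runner_pos, runner_color, tile_color, reach=1.5):
    visible = runner_color != tile_color
    dx = tagger_pos[0] - runner_pos[0]
    dy = tagger_pos[1] - runner_pos[1]
    return visible and (dx * dx + dy * dy) ** 0.5 <= reach
```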
I learned a lot at this jam. I worked on some of the programming but the other programmers on my team were much more skilled than I was. Jordan and August are absolutely amazing at scripting and I hope I can catch up to them soon. I learned a lot about working with SVN as well as scripting network stuff which will definitely come in handy in Fusion Knights. All in all I had a great time, the stream was a blast and I learned a ton of useful things about working together. Thanks Team!
Anyways, between the Game Jam, the first week of class, and my 21st (which was awesome, btw), I haven't worked a whole lot on Fusion Knights, although I did play with networking a little bit to get the second player working, and I started scripting the sword attack. Hopefully I'll have at least the attack done by next week.