New Android Project : VR 3D framework

So Virtual Reality has been one of my favourite subjects since I saw The Lawnmower Man as a kid.  I have made some very strange things in the past trying to make my own hardware / software for VR.  Oh boy, were the Amiga days good fun!

So a few months ago, I made a simple split-screen 3D demo for iOS that used the Duo Gamer controller. It can be found here:

https://github.com/burf2000/3DiOSDemo

I will be updating that soon to support standard iOS 7 controllers so that Apple will accept it.  The Duo was added as a hack and never had an SDK 🙁

However, due to the awesome news of Google Cardboard (VR via cardboard!) and Oculus working with Samsung to create a VR headset (it uses your Note 4), I thought it would be a good idea to start making my own, similar to the iOS one but on Android.

It can be found here:

https://github.com/burf2000/Android-3D-SplitScreen

I have just added support for the Moga game controller.

On a side note, this is worth a read:

Getting VR to Run on Android Is “Hell,” Oculus’ Carmack Says


HACK24 : Submitted


Finally, a month and a week late, the first version of Hack24 has been submitted.  It's an emotional moment: trying to write even a simple MMO single-handedly is a massive task, and trying to do it in a month with a family and a full-time job is practically impossible.  I did it in two months 🙂

I will write more on it soon; enjoying a beer at the moment.

Version 1.0 supports (I wanted to call it 0.1):

  • Hacking, buying, entering, and claiming from buildings
  • Scanning and robbing players
  • Items that let players teleport, scan, brute-force hack, enter buildings, shield themselves, speed up and turn invisible

I will be updating the website and writing a how-to-play guide 🙂


Hack24 : Oh boy, it's coming!

So I had planned to release Hack24 by the end of May, and as we are now mid-June, I failed!  It's pretty hard to make an MMO in a month part-time.

Since I last blogged about the project, I have done the following:

  • Improved FPS, even on older devices, by optimising what is rendered
  • Items now spawn and players can collect them
  • Speed, money and hacking items can now be used
  • Players can now gain XP and level up
  • Added an icon

Main items left to do:

  • Be able to teleport (new idea)
  • Be able to buy items
  • Quests / missions
  • Player interaction
  • Add help
  • Finish iPhone UI

I hope to at least submit to Apple by the end of the month.

Hack24 MMO : We have trees!

I had planned to do more updates, but I have been trying to finish v0.1 of the game for the end of May.  To be honest, it is probably not going to happen; however, I am making good progress 🙂 What can you do so far?

  1. The game has 10 different types of buildings / world objects
  2. You can see other players in the game
  3. You can buy and hack buildings to gain XP and money
  4. Bought buildings can be entered, and you can claim money from them
  5. The HUD shows your buildings, free buildings and other players' buildings
  6. Exploring gives you XP
  7. The more XP you have, the higher your level and the better hacker you become

What's next?

  1. Scan and rob players
  2. Spawn random items for people to collect
  3. Be able to buy items
  4. Implement item properties (shield, lives, invisibility etc)
  5. Fix collision
  6. Quests / Missions
  7. BUG FIXING

Using OBJ / Blender objects in iOS

This is a small post about getting Blender objects into iOS using the OBJ file format.

First off, below is a fantastic tutorial on how to do it.

How To Export Blender Models to OpenGL ES: Part 1/3

This tutorial shows you how to export an OBJ file from Blender and convert it into header files you can use directly in your iOS source.
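For context, those generated headers are essentially big C arrays of per-vertex data (positions, texture co-ordinates, normals) plus a vertex count. The names and the tiny placeholder triangle below are my own, not the tutorial's exact output, but with GLKit and OpenGL ES 2.0 the drawing side looks roughly like this:

#import <GLKit/GLKit.h>

// Placeholder data standing in for the arrays the conversion tool generates
// (one triangle here; a real generated header contains the whole mesh).
static const GLfloat cylinderPositions[] = { 0, 0, 0,   1, 0, 0,   0, 1, 0 };
static const GLfloat cylinderTexels[]    = { 0, 0,      1, 0,      0, 1 };
static const GLfloat cylinderNormals[]   = { 0, 0, 1,   0, 0, 1,   0, 0, 1 };
static const GLsizei cylinderVertexCount = 3;

// Draws the mesh with a GLKBaseEffect, feeding each attribute straight from the arrays.
static void DrawCylinder(GLKBaseEffect *effect)
{
    [effect prepareToDraw];

    glEnableVertexAttribArray(GLKVertexAttribPosition);
    glVertexAttribPointer(GLKVertexAttribPosition, 3, GL_FLOAT, GL_FALSE, 0, cylinderPositions);

    glEnableVertexAttribArray(GLKVertexAttribTexCoord0);
    glVertexAttribPointer(GLKVertexAttribTexCoord0, 2, GL_FLOAT, GL_FALSE, 0, cylinderTexels);

    glEnableVertexAttribArray(GLKVertexAttribNormal);
    glVertexAttribPointer(GLKVertexAttribNormal, 3, GL_FLOAT, GL_FALSE, 0, cylinderNormals);

    glDrawArrays(GL_TRIANGLES, 0, cylinderVertexCount);

    glDisableVertexAttribArray(GLKVertexAttribPosition);
    glDisableVertexAttribArray(GLKVertexAttribTexCoord0);
    glDisableVertexAttribArray(GLKVertexAttribNormal);
}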

The issue comes when you try to make your own object (I picked a cylinder) and convert it using the OS X tool the tutorial gets you to create.  Every object I made crashed the tool.

For it to work, you need to apply UV (texture) co-ordinates to the mesh.  For someone who knows very little about 3D, this caught me out.

To fix it:

  1. Select the object and press Tab to enter Edit Mode (so the Mesh menu is available)
  2. Press U to bring up the UV mapping menu
  3. Select Smart UV Project
  4. Press OK.

I have no idea what it does, but it does allow me to export my object and convert it using the tool created in the tutorial linked above 🙂


GAME JAM 2

So, on the 2nd of May we had our second Compsoft Game Jam.  I use the term loosely, as we don't follow any format other than having fun.

The first Game Jam we had was based on creating a game within 24 hours around a randomly picked theme.  This was mega fun and spawned (from me) Disjointed Tunnels (Android) and Stupid Chicken (iOS), which are both released.  Other people made some awesome games on different platforms.  We decided 24 hours was a little hard work straight after the office (yes, we go to work before these), so the next one was 12 hours.

The other change this time was that instead of starting new projects (the main point of a Game Jam), most of us would continue projects we are passionate about.

There were 8 of us this time (up from 5); the projects are listed below:

  • Hack24 (mine) is a 3D MMO game in an endless world where you hack buildings and aim to be the top hacker
  • Augmented reality draughts (new)
  • Noughts & Crosses, but in a multiple-grid system like Sudoku (new)
  • A full-blown network / game engine that's massively scalable
  • A Flappy Bird clone
  • A postman-frenzy type game (new)
  • A spaceship shooter game
  • A web-based space shooter / resource game

As you can see, some of these projects are quite large, which leads me on to my next post about Hack24.

The general feedback was that people really enjoyed it.  I think in some ways it lacked the special feeling of creating something completely new based on a random theme; however, when someone is passionate about a project, they really churn out the code.


GLKit Object Picking

For anyone who has played with OpenGL ES, it's really hard work (iOS or Android). I have spent many hours on OpenGL ES 1.0 and 2.0, making my own frameworks, etc.

Apple introduced GLKit in iOS 5, if I remember correctly, and it was designed to make the maths side easier. One of the things I struggle with is object picking: touching the screen and turning that into a 3D position in the OpenGL world. Generally there are two ways to do this: colour picking and ray picking.
Colour picking is where, when the user touches the screen, every object is rendered in a unique flat colour, and you then read back the colour under the touch point to work out which object they touched.
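As a rough illustration (not code from this project), a colour-picking pass might look something like the sketch below: each object is drawn flat-shaded in an ID colour and glReadPixels reads back the pixel under the finger. The self.objects array and the drawFlatShadedWithColour: method are hypothetical names, and framebuffer management is glossed over.

// A minimal colour-picking sketch; self.objects and drawFlatShadedWithColour:
// are hypothetical. Returns the index of the touched object, or -1 for none.
- (NSInteger)pickObjectAtPoint:(CGPoint)tapLoc
{
    tapLoc.x *= [UIScreen mainScreen].scale;
    tapLoc.y *= [UIScreen mainScreen].scale;

    GLint viewport[4];
    glGetIntegerv(GL_VIEWPORT, viewport);

    // Render every object in a unique flat colour (no lighting or textures).
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    for (NSUInteger i = 0; i < self.objects.count; i++)
    {
        // Encode the object index in the red channel (supports up to 255 objects).
        GLKVector4 idColour = GLKVector4Make((i + 1) / 255.0f, 0.0f, 0.0f, 1.0f);
        [self.objects[i] drawFlatShadedWithColour:idColour];
    }

    // Read back the pixel under the touch (OpenGL's origin is bottom-left).
    GLubyte pixel[4];
    glReadPixels((GLint)tapLoc.x, (GLint)(viewport[3] - tapLoc.y), 1, 1, GL_RGBA, GL_UNSIGNED_BYTE, pixel);

    return pixel[0] == 0 ? -1 : pixel[0] - 1;
}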

Ray picking is where you fire off a ray from the touch point into the scene and work out the first object it hits. This requires quite a bit of maths, which people have already done for you, but in different programming languages.

However, the point of this article is that I discovered Apple have added a function to help with ray picking, namely GLKMathUnproject.

The code below is based on this Stack Overflow post: http://stackoverflow.com/questions/20214498/updating-opengl-es-touch-detection-ray-tracing-for-ipad-retina

It works out when you touch the floor (self.plane) and then puts self.cube at that position:

- (BOOL)didTouchObject:(CGPoint)tapLoc
{
    // Convert the touch location from points to pixels to match the GL viewport.
    tapLoc.x *= [UIScreen mainScreen].scale;
    tapLoc.y *= [UIScreen mainScreen].scale;

    bool testResult;

    GLint viewport[4];
    glGetIntegerv(GL_VIEWPORT, viewport);

    // Unproject the touch at the near (z = 0) and far (z = 1) planes, flipping y
    // because UIKit's origin is top-left and OpenGL's is bottom-left.
    GLKVector3 nearPt = GLKMathUnproject(GLKVector3Make(tapLoc.x, (tapLoc.y - viewport[3]) * -1, 0.0), _baseModelViewMatrix, _projectionMatrix, &viewport[0], &testResult);
    GLKVector3 farPt = GLKMathUnproject(GLKVector3Make(tapLoc.x, (tapLoc.y - viewport[3]) * -1, 1.0), _baseModelViewMatrix, _projectionMatrix, &viewport[0], &testResult);

    // farPt = GLKVector3Subtract(farPt, nearPt);

    // Step along the ray in small increments (1/1000 of the near-to-far distance per step).
    float xDif = (farPt.x - nearPt.x) / 1000;
    float yDif = (farPt.y - nearPt.y) / 1000;
    float zDif = (farPt.z - nearPt.z) / 1000;

    for (int i = 0; i < 100; i++)
    {
        // Axis-aligned bounding-box test against the floor (self.plane).
        if ((nearPt.x + (xDif * i)) > self.plane.position.x - self.plane.scale.x && (nearPt.x + (xDif * i)) < self.plane.position.x + self.plane.scale.x &&
            (nearPt.y + (yDif * i)) > self.plane.position.y - self.plane.scale.y && (nearPt.y + (yDif * i)) < self.plane.position.y + self.plane.scale.y &&
            (nearPt.z + (zDif * i)) > self.plane.position.z - self.plane.scale.z && (nearPt.z + (zDif * i)) < self.plane.position.z + self.plane.scale.z)
        {
            NSLog(@"Hit cube");

            // Move the cube to the point on the floor that was touched.
            self.cube.position = GLKVector3Make(nearPt.x + (xDif * i), nearPt.y + (yDif * i), nearPt.z + (zDif * i));
            return YES;
        }
    }

    return NO;
}
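For completeness, here is a rough sketch (my own, not from the Stack Overflow post) of how the method might be called from a touch handler in the GLKViewController subclass that owns it:

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Take the touch position in view co-ordinates (points); didTouchObject:
    // converts it to pixels itself.
    UITouch *touch = [touches anyObject];
    CGPoint tapLoc = [touch locationInView:self.view];

    if ([self didTouchObject:tapLoc])
    {
        NSLog(@"Cube moved to the touched point on the floor");
    }
}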

A free location based Augmented Reality Engine for iOS

For an upcoming project I am working on, I needed an augmented reality camera view that overlays location-based data which users can interact with. It also needed a radar in the top corner indicating where these locations were.

When I first looked at augmented reality back in 2011, the main examples were based on an open-source project called iPhoneARKit. I was looking for an updated version and found this little gem:

iOSARKit by Carlos Alonso. I actually spoke to Carlos, who was very helpful and friendly.

For more info, please check his blog

I have made a few tweaks to this, which can be found here: Burf GitHub

Disjointed Tunnels is on Ouya!

I received some good news today: my game Disjointed Tunnels was accepted onto the Ouya games store.
It took two attempts to get in, due to me not handling exiting the game properly when using the Ouya controller, and I needed to remove the help that was aimed at phones. The general process and feedback were very good.

The game is completely free.
Disjointed Tunnels

3D Stereoscopic demo for iOS

So the main reason I created a GitHub account was to upload my work on BurfWorld3D (sorry for the rubbish name).

I have a keen interest in virtual reality, and I decided to buy a Durovis Dive (they also sent me one as I signed up as a developer). This allows you to turn your Android phone or iPhone into a head-mounted display, giving you a virtual reality experience for a small cost. The issue I had was that the Durovis OpenDive SDK is for Unity only, so I started the hunt for something I could use or convert so that native OpenGL apps could be made.

I also got a Duo Gamer iPad controller for Christmas, which was made for Gameloft games only; luckily, however, someone had made their own SDK for it, which I found on GitHub. I thought this would be a great way to move around in the 3D space.

So, I created a simple 3D demo where you can walk around buildings, up stairs, etc. It has collision detection and gravity; it still needs a lot of work but may help others.


BurfWorld 3D on GitHub

Thanks to mringwal for the Duo Gamer SDK.

Thanks to sphereinabox for the initial OpenGL iOS split-screen view for the iPad.