Using .obj / Blender objects in iOS

This is a small post about getting Blender objects into iOS using the .obj file format.

First off, below is a fantastic tutorial on how to do it.

How To Export Blender Models to OpenGL ES: Part 1/3

This tutorial shows you how to export an .obj file from Blender and convert it into header files that you can use directly in your iOS source.
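For context, an .obj file is plain text: "v" lines are vertex positions, "vt" lines are texture (UV) coordinates, "vn" lines are normals, and "f" lines build faces by indexing into those lists (1-based). A tiny illustrative example, a single triangle rather than anything from the tutorial, looks like this:

# positions
v -1.0 0.0 0.0
v 1.0 0.0 0.0
v 0.0 1.0 0.0
# texture (UV) coordinate per corner
vt 0.0 0.0
vt 1.0 0.0
vt 0.5 1.0
# one shared normal
vn 0.0 0.0 1.0
# one face: position/uv/normal indices
f 1/1/1 2/2/1 3/3/1

The tool from the tutorial turns those lists into plain C arrays in a header file, so it expects all three kinds of data to be present.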

The issue comes when you try to make your own object (I picked a cylinder) and convert it using the OS X tool the tutorial has you build. Every object I made crashed the tool.

For it to work you need to apply UV (texture mapping) coordinates to the mesh. For someone who knows very little about 3D, this caught me out.

To fix it in Blender:

  1. Select the object and press Tab to enter Edit Mode (so the mesh tools are available)
  2. Press U to bring up the UV Mapping menu
  3. Select Smart UV Project
  4. Press OK.

I had no idea what it did at the time, but it does allow me to export my object and convert it using the tool created in the tutorial above 🙂
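As best I understand it now: Smart UV Project automatically unwraps the mesh and generates texture (UV) coordinates for it. You can see the difference in the exported .obj, where faces without UVs omit the middle index. My guess is the conversion tool assumes that middle index is always there and crashes when it isn't:

# exported without UVs: no middle (vt) index
f 1//1 2//1 3//1
# exported after Smart UV Project: position/uv/normal
f 1/1/1 2/2/1 3/3/1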


GLKit Object Picking

For anyone who has played with OpenGL ES, it's really hard work (on iOS or Android). I have spent many hours on OpenGL ES 1.0 and 2.0, making my own frameworks and so on.

Apple introduced GLKit in iOS 5, if I remember correctly, and it was designed to make the maths side easier. One of the things I struggle with is object picking: turning a touch on the screen into a 3D position in the OpenGL world. Generally there are two ways to do this, colour picking and ray picking.
Colour picking works by rendering every object in a unique flat colour (without presenting that frame to the screen), then reading back the pixel colour under the touch point to work out which object the user touched.
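As a rough sketch of the idea, assuming a scene that keeps its objects in an array and a hypothetical renderFlatWithColour: helper that draws an object in a single flat colour (neither is a real API, just illustration):

// Colour-picking sketch: draw each object in a unique flat colour into the
// current framebuffer (don't present it), then read the pixel under the touch.
- (NSInteger)pickObjectAt:(CGPoint)tapLoc
{
    tapLoc.x *= [UIScreen mainScreen].scale;
    tapLoc.y *= [UIScreen mainScreen].scale;

    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    for (NSUInteger i = 0; i < self.objects.count; i++) {
        // Encode the object's index in the red channel (enough for 255 objects)
        [self.objects[i] renderFlatWithColour:GLKVector4Make((i + 1) / 255.0f, 0, 0, 1)];
    }

    GLint viewport[4];
    glGetIntegerv(GL_VIEWPORT, viewport);

    // UIKit's origin is top-left, OpenGL's is bottom-left, so flip y
    GLubyte pixel[4];
    glReadPixels(tapLoc.x, viewport[3] - tapLoc.y, 1, 1,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixel);

    return pixel[0] > 0 ? pixel[0] - 1 : -1; // -1 means nothing was hit
}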

Ray picking is where you fire a ray from the touch point into the scene and work out the first object it hits. This requires quite a bit of maths, which people have already worked out for you, but often in different programming languages.

However, the point of this article is that I discovered Apple have added a function to help with ray picking, namely GLKMathUnproject.
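It lives in GLKit's GLKMathUtils.h with the signature below: you pass a point in window coordinates (z = 0.0 for the near clipping plane, 1.0 for the far one), your model-view and projection matrices and the viewport, and it gives you back the matching point in 3D space:

GLKVector3 GLKMathUnproject(GLKVector3 window,
                            GLKMatrix4 model,
                            GLKMatrix4 projection,
                            int *viewport,
                            bool *success);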

The code below is based on this Stack Overflow post: http://stackoverflow.com/questions/20214498/updating-opengl-es-touch-detection-ray-tracing-for-ipad-retina

It works out whether you have touched the floor (self.plane) and, if so, moves self.cube to that position:

- (BOOL)didTouchObject:(CGPoint)tapLoc
{
    // Convert from points to pixels for Retina screens
    tapLoc.x *= [UIScreen mainScreen].scale;
    tapLoc.y *= [UIScreen mainScreen].scale;

    GLint viewport[4];
    glGetIntegerv(GL_VIEWPORT, viewport);

    // Success flag set by GLKMathUnproject (really it should be checked)
    bool testResult;

    // Unproject the touch at the near (z = 0.0) and far (z = 1.0) planes to
    // get the two ends of the pick ray. UIKit's y axis points down, so flip
    // it into OpenGL's bottom-left origin.
    GLKVector3 nearPt = GLKMathUnproject(GLKVector3Make(tapLoc.x, (tapLoc.y - viewport[3]) * -1, 0.0),
                                         _baseModelViewMatrix, _projectionMatrix, &viewport[0], &testResult);
    GLKVector3 farPt = GLKMathUnproject(GLKVector3Make(tapLoc.x, (tapLoc.y - viewport[3]) * -1, 1.0),
                                        _baseModelViewMatrix, _projectionMatrix, &viewport[0], &testResult);

    // March along the ray in 1000 small steps, testing each point against the
    // plane's axis-aligned bounding box (position is its centre, scale its
    // half-size per axis)
    float xDif = (farPt.x - nearPt.x) / 1000;
    float yDif = (farPt.y - nearPt.y) / 1000;
    float zDif = (farPt.z - nearPt.z) / 1000;

    for (int i = 0; i < 1000; i++)
    {
        if ((nearPt.x + (xDif * i)) > self.plane.position.x - self.plane.scale.x && (nearPt.x + (xDif * i)) < self.plane.position.x + self.plane.scale.x &&
            (nearPt.y + (yDif * i)) > self.plane.position.y - self.plane.scale.y && (nearPt.y + (yDif * i)) < self.plane.position.y + self.plane.scale.y &&
            (nearPt.z + (zDif * i)) > self.plane.position.z - self.plane.scale.z && (nearPt.z + (zDif * i)) < self.plane.position.z + self.plane.scale.z)
        {
            NSLog(@"Hit plane");
            self.cube.position = GLKVector3Make(nearPt.x + (xDif * i), nearPt.y + (yDif * i), nearPt.z + (zDif * i));
            return YES;
        }
    }

    return NO;
}
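A note on the approach: stepping along the ray like this is simple, but it can miss a thin object if the steps are too coarse. The standard alternative is a closed-form ray/box (slab) test; below is a minimal sketch under the same assumptions as the code above (centre is the box's position, halfSize its scale per axis):

// Closed-form ray vs axis-aligned box (slab method). Returns YES, and the
// hit point via *hit, if the segment from nearPt to farPt crosses the box.
// Assumes GLKit is imported (#import <GLKit/GLKit.h>).
static BOOL RayHitsBox(GLKVector3 nearPt, GLKVector3 farPt,
                       GLKVector3 centre, GLKVector3 halfSize,
                       GLKVector3 *hit)
{
    GLKVector3 dir = GLKVector3Subtract(farPt, nearPt);
    float tMin = 0.0f, tMax = 1.0f;

    for (int axis = 0; axis < 3; axis++) {
        float lo = centre.v[axis] - halfSize.v[axis];
        float hi = centre.v[axis] + halfSize.v[axis];
        if (fabsf(dir.v[axis]) < 1e-6f) {
            // Ray is parallel to this slab: miss if it starts outside it
            if (nearPt.v[axis] < lo || nearPt.v[axis] > hi) return NO;
        } else {
            float t1 = (lo - nearPt.v[axis]) / dir.v[axis];
            float t2 = (hi - nearPt.v[axis]) / dir.v[axis];
            if (t1 > t2) { float tmp = t1; t1 = t2; t2 = tmp; }
            tMin = MAX(tMin, t1);
            tMax = MIN(tMax, t2);
            if (tMin > tMax) return NO; // the per-axis intervals don't overlap
        }
    }

    if (hit) *hit = GLKVector3Add(nearPt, GLKVector3MultiplyScalar(dir, tMin));
    return YES;
}

Calling that with self.plane.position and self.plane.scale replaces the whole for loop above, and it can't step over anything.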