
The blog of 姑射道人

New blog address: nixuchen.com
Hello Augmented World  

2013-11-01 14:47:01 | Category: android

Welcome to the world of augmented reality and to the MobileSDK. This example application shows how a simple AR use case can be implemented. In this scenario we want the MetaioMan to stand with his feet on a picture of himself.

Requirements

At this stage you should already know how to set up a general mobile application project in your development environment. To get you started with your first tutorial, we have already set up a workspace for you. The section Getting Started provides further details on configuring your development environment for Android and iOS.

Implementation

Assets needed

  • Model: The 3D model/geometry to be shown. In our case this is the MetaioMan, which is available as an MD2 model (metaioman.md2).
  • Texture: We also need a texture to make the model look better. Here we use a PNG image, which must have the same name as the model (without the file extension).
  • Pattern: The tracking pattern/image the MetaioMan should stand on. This is a simple image file; here we use a JPEG file (metaioman_target.jpg), but a PNG file would work as well. For the tracking pattern the file name is irrelevant.
  • TrackingData: The tracking data configures the tracking component of the MobileSDK. It is an XML file containing all information relevant for tracking, the most important being the tracking pattern.
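To give an idea of what such a tracking configuration looks like, here is an illustrative sketch. The element names are assumptions (they vary between SDK versions), so use the TrackingData_MarkerlessFast.xml bundled with the example project as the actual template; the essential entry is the reference to the tracking pattern image.

```xml
<!-- Illustrative sketch only; element names vary between SDK versions.
     The key piece of information is the reference to the pattern image. -->
<TrackingData>
  <Sensors>
    <Sensor type="FeatureBasedSensorSource">
      <SensorCOS>
        <SensorCosID>Patch1</SensorCosID>
        <Parameters>
          <!-- the tracking pattern described above -->
          <ReferenceImage>metaioman_target.jpg</ReferenceImage>
        </Parameters>
      </SensorCOS>
    </Sensor>
  </Sensors>
</TrackingData>
```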

The HelloAugmentedWorldExample class

You can find this class in the simple package of the example application project.

In this class the TrackingData and the textured model are tied together.

public class HelloAugmentedWorldExample extends ARViewActivity  {
    
    /**
     * The geometry to be displayed
     */
    private IUnifeyeMobileGeometry mGeometry;

    /**
     * Tracking file you like to use. The file must be within the assets folder
     * of this project.
     */
    private final String mTrackingDataFileName = "TrackingData_MarkerlessFast.xml";

    /**
     * Gets called by the super-class after the GLSurface has been created. 
     * It runs on the OpenGL-thread.
     */
    @Override
    protected void loadUnifeyeContents() {
        try {

            // Load Tracking data
            loadTrackingData(mTrackingDataFileName);

            // Load all geometry
            mGeometry = loadGeometry("metaioman.md2");
            
            // Do something with it, like scaling
        mGeometry.setMoveScale(new Vector3d(1, 1, 1));
        } catch (Exception e) {
            Logger.logException(e);
        }
    }
}

Further Readings

The ARViewActivity class

If you would like a deeper understanding of what is happening, take a look at the ARViewActivity class, which is the superclass of HelloAugmentedWorldExample.

There are four important things to be done in the right order:

  1. Initialize the MobileSDK
  2. Setup the camera
  3. Setup a GLSurfaceView
  4. Load the tracking data

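To make the required ordering explicit, the four steps can be sketched as a plain-Java skeleton. All SDK calls here are hypothetical stubs that merely record their invocation; the real implementation lives in ARViewActivity and is described step by step in the following sections.

```java
import java.util.ArrayList;
import java.util.List;

/*
 * Minimal sketch of the startup order. The SDK calls are replaced by
 * stubs that record their names, so only the ordering across the two
 * Android lifecycle methods is illustrated here.
 */
public class StartupOrderSketch {

    final List<String> steps = new ArrayList<>();

    /** onCreate: initialize the SDK and inflate the GUI layout. */
    void onCreate() {
        steps.add("createMobileSDK");      // 1. initialize the MobileSDK
        steps.add("inflateLayout");        //    prepare mGUIView for later
    }

    /** onStart: content view, camera, GL surface, tracking data. */
    void onStart() {
        steps.add("setContentView");       //    required before the camera
        steps.add("activateCamera");       // 2. setup the camera
        steps.add("createGLSurfaceView");  // 3. setup a GLSurfaceView
        steps.add("loadTrackingData");     // 4. load the tracking data
    }

    public static void main(String[] args) {
        StartupOrderSketch activity = new StartupOrderSketch();
        activity.onCreate();
        activity.onStart();
        System.out.println(activity.steps);
    }
}
```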
Initialize the MobileSDK

This happens after the Activity has been created by Android, in the onCreate-method, by calling createMobileSDK. There is no class that can be instantiated directly with 'new'; a static factory method is used instead. You can create a new instance like this:

mMobileSDK = AS_UnifeyeSDKMobile.CreateUnifeyeMobileAndroid(
                    this, Configuration.signature);

The object returned is a reference to the MobileSDK and has the type IUnifeyeMobileAndroid.

The next important step is to inflate the layout.

mGUIView = View.inflate(this, getGUILayout(), null);

A reference to the View is saved so it can be attached later. Before it becomes visible, we need to create the AR view first, so that it is the first element in the view hierarchy.

Setup of the camera

While the initialization of the MobileSDK took place in the onCreate-method, everything from now on happens in the onStart-method.

Before we can activate the camera, we need a content view. This may seem obvious, but it is very important. After that, the camera is activated with the activateCamera-method.

mMobileSDK.activateCamera( Configuration.Camera.deviceId,
                        Configuration.Camera.resolutionX,
                        Configuration.Camera.resolutionY);

We store all necessary data in the Configuration class here. The activateCamera method returns a vector with the actual dimensions of the camera image. We recommend the following values:

  • deviceId: 0 selects the main camera of the device; use 1 for the front-facing camera.

  • resolutionX, resolutionY: We recommend 480x320 as the best tradeoff between visual quality and performance.

Setup of the GLSurfaceView

Afterwards, a UnifeyeGLSurfaceView providing an OpenGL context needs to be created. Additionally, you can register a callback to receive several events: onSurfaceCreated(), onSurfaceChanged(), onSurfaceDestroyed(), onDrawFrame() and onScreenshot().

/*
 * Create a GLSurfaceView. The context 'this' is the current Activity.
 */
mUnifeyeSurfaceView = new UnifeyeGLSurfaceView(this);
/*
 * Register the current Activity to receive callbacks. 
 * With registerCallback the onSurface...() events become available. 
 * setOnTouchListener lets us get touch-events. 
 */
mUnifeyeSurfaceView.registerCallback(this);
mUnifeyeSurfaceView.setOnTouchListener(this);

Loading the Tracking Data

You can load tracking data with the loadTrackingData-method. It looks for the supplied file name in the assets folder, resolves the path and calls setTrackingData on the MobileSDK.

/**
 * Loads trackingDataFileName from the assets folder
 * @param trackingDataFileName The filename of the tracking data to be loaded. 
 * @return true on success, false otherwise
 */
protected boolean loadTrackingData(String trackingDataFileName) {
    MobileSDKExampleApplication app = (MobileSDKExampleApplication) getApplication();
    String filepathTracking = app.getAssetPath(trackingDataFileName);
    boolean result = mMobileSDK.setTrackingData(filepathTracking);
    Logger.log(Log.ASSERT, "Tracking data loaded: " + result);
    return result;
}

This method is used in the HelloAugmentedWorldExample class shown above.

Registering Callbacks for Sounds and Animations

Callbacks can be registered using registerAudioCallback for events like onAudioPause, onAudioRestart or onAudioStop, and registerCallback to receive an onAnimationEnd event when a model's animation has finished.

mUnifeyeMobile.registerAudioCallback( mUnifeyeSurfaceView.getUnifeyeAudioRenderer() );
mUnifeyeMobile.registerCallback( mCallbackHandler );
Copyright © 1997-2017 NetEase