Leap Motion VR Support Now Directly Integrated in Unreal Engine

It is actually a community-contributed open-source plugin: https://github.com/getnamo/leap-ue4

Quick Setup Guide

  • Create a new project or open an existing one.
  • Open the Window menu and select Plugins.
  • Click the Input Devices category and find the Leap Motion plugin.
  • Select Enabled. The Editor will prompt you to restart; go ahead and restart.
  • After the restart, the plugin is enabled and ready to use.
(Screenshot: enabling the plugin)
To try the rigged or floating hands characters, change your game mode to use LeapRiggedCharacter or LeapFloatingHandsCharacter as your default pawn.
(Screenshot: basic game mode setup)

Read More

VR Reflections VI: On Optimization

Keyword: needs OpenGL extension support: GL_OVR_multiview.

From the Intel Stereo 3D SDK. Check it here: https://software.intel.com/en-us/intel-stereo-3d-sdk
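
For context on that keyword: GL_OVR_multiview lets a single draw call render both eyes' views into the two layers of a texture array, which is the optimization being referred to. Below is a rough sketch of the setup in plain C / OpenGL ES, assuming an ES 3.0 context in which the extension is exposed; it illustrates the extension itself, not anything from the Intel SDK.

#include <GLES3/gl3.h>
#include <GLES2/gl2ext.h>  // GL_OVR_multiview tokens and typedefs

// In a real app the extension entry point is resolved at run time, e.g.:
//   PFNGLFRAMEBUFFERTEXTUREMULTIVIEWOVRPROC glFramebufferTextureMultiviewOVR =
//       (PFNGLFRAMEBUFFERTEXTUREMULTIVIEWOVRPROC)
//           eglGetProcAddress("glFramebufferTextureMultiviewOVR");
// Here it is assumed to be available already.

// Vertex shader: each invocation is broadcast to both views, and
// gl_ViewID_OVR selects the per-eye view-projection matrix.
static const char *kMultiviewVS =
    "#version 300 es\n"
    "#extension GL_OVR_multiview : require\n"
    "layout(num_views = 2) in;\n"
    "uniform mat4 uViewProj[2];\n"
    "in vec4 aPosition;\n"
    "void main() {\n"
    "  gl_Position = uViewProj[gl_ViewID_OVR] * aPosition;\n"
    "}\n";

// Create a framebuffer whose color attachment is a 2-layer texture array:
// layer 0 = left eye, layer 1 = right eye. Draw calls then write both eyes
// in a single pass instead of rendering the whole scene twice.
static GLuint CreateMultiviewFramebuffer(GLsizei width, GLsizei height) {
  GLuint colorTex = 0, fbo = 0;

  glGenTextures(1, &colorTex);
  glBindTexture(GL_TEXTURE_2D_ARRAY, colorTex);
  glTexStorage3D(GL_TEXTURE_2D_ARRAY, 1, GL_RGBA8, width, height, 2);

  glGenFramebuffers(1, &fbo);
  glBindFramebuffer(GL_DRAW_FRAMEBUFFER, fbo);
  glFramebufferTextureMultiviewOVR(GL_DRAW_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                                   colorTex, 0 /* level */,
                                   0 /* baseViewIndex */, 2 /* numViews */);
  return fbo;
}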

Intel® Stereo 3D SDK | Intel® Developer Zone

Features

3D Camera Generation

Use the API to generate stereo 3D cameras
Just pass your existing camera matrices, such as the view matrix and projection matrix, to the SDK. The SDK automatically generates 3D camera matrices and returns them.

Auto Scene Adjustment

Adapts to different game scenes and adjusts disparities
Just as good stereo 3D photographers adjust their 3D cameras according to different scene styles, the SDK can auto-detect scene styles and auto-change camera settings and screen disparities. 

It seems to adjust the camera, and the scene, automatically. Looks quite impressive.
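
The SDK's actual function signatures are not shown in this post, so here is only the underlying idea: "generating stereo cameras" from a single view matrix essentially means offsetting the eye by half the interpupillary distance to each side. A minimal sketch of that using GLKit (the same math types as the Cardboard code further down this page); the function name and parameters are mine, not Intel's.

#import <GLKit/GLKit.h>

// Conceptual sketch only (not the Intel Stereo 3D SDK API): derive left/right
// view matrices from a single "center" view matrix. Moving the eye by -ipd/2
// along the camera's right axis shifts every point by +ipd/2 along view-space
// X, and vice versa.
static void MakeStereoViewMatrices(GLKMatrix4 centerView,
                                   float interpupillaryDistance,  // world units
                                   GLKMatrix4 *leftView,
                                   GLKMatrix4 *rightView) {
  const float half = 0.5f * interpupillaryDistance;
  *leftView  = GLKMatrix4Multiply(GLKMatrix4MakeTranslation(+half, 0.0f, 0.0f),
                                  centerView);
  *rightView = GLKMatrix4Multiply(GLKMatrix4MakeTranslation(-half, 0.0f, 0.0f),
                                  centerView);
  // A complete stereo rig would also shift each eye's projection frustum
  // asymmetrically to set the zero-parallax (screen) plane; that part is
  // omitted here, and is presumably what the SDK's auto scene adjustment tunes.
}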


Read More

You can 3D print a Leap Motion mount yourself.

http://www.thingiverse.com/thing:625401

There is also one that looks like this.

Not bad.

Designed to replace the standard cable door on the Oculus Rift DK2, the Leap Motion Cable Door Mount lets you easily attach and remove the Leap Motion Controller without the use of adhesive. For use with the Leap Motion VR tracking beta. Further information and resources available at https://developer.leapmotion.com/vr.

For best quality, please print in FDM PC-ABS material in .005″ slices.

Alternatively, our standard mount (designed for use on a variety of headsets) is available for download here: http://www.thingiverse.com/thing:445866

Copyright 2015 Leap Motion. Provided under license. 3D CAD file for personal non-commercial use only, as is.

Planning to ask Ga大 to print one on his 曲率 (3D printer) and see how it turns out.

Read More

VR Reflections V: The Case for a Single Camera

Had a bit of an epiphany in the restroom. To sum it up, a few points:

1. Why is VR called virtual reality? Because it simulates reality. So if the content has a realistic style and is fully immersive, it is indeed far more impressive than a cartoon style. Anyone who has seen Inception will remember the spinning top, used to break the dream (the immersion); the other way out is falling. When we look at a world full of cartoon characters, we always seem to know that we are not in the real world.


2. The reason for a single camera: a person has only one pair of eyes, so the brain cannot fill in cut-away shots. The various techniques of traditional film and television production don't really work here, and they break immersion. How could one person see five camera feeds at the same time? Right?

3. A HUD pasted over the camera seems to keep reminding the player that they are inside a game.

4. Granted, the assumptions above apply when what we are after is a high degree of realism. As a fallback, perhaps a sufficiently convincing stereoscopic 3D feel is already enough.

So how do we do it?

1. Ideally, build genuinely interactive UI instead of hanging it on the HUD, activated by the user's head rotation and a gaze ray, much like OnMouseEnter / OnMouseExit (see the sketch after this list). For example, a monitor whose responsive UI is embedded in its screen feels very real; the same goes for an instrument panel, a switch, and so on.

2. The camera should always be controlled by the player, to preserve a real sense of distance, even when playing from a fixed spot.

3. Provide a sense of realism. Would reaching out to pull down a power lever actually feel good? Things like that still have to be tried in person before deciding. There are no universal rules, but this kind of experience design can be worked out by testing extensively on the device; it is just no longer a simple port.
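
To make the OnMouseEnter / OnMouseExit analogy in point 1 concrete, here is a small, engine-agnostic sketch of gaze activation via a head ray. The GazeTarget type and the "enter/exit" hooks are invented for illustration; GLKit is used only because it also appears in the Cardboard code later on this page.

#import <GLKit/GLKit.h>

typedef struct {
  GLKVector3 center;  // world-space position of the interactive element
  float radius;       // simple sphere proxy used for hit testing
  BOOL hasFocus;      // whether the gaze ray was on it last frame
} GazeTarget;

// Returns YES if the ray (origin + t * direction, t >= 0) passes within
// `radius` of `center`.
static BOOL RayHitsSphere(GLKVector3 origin, GLKVector3 direction,
                          GLKVector3 center, float radius) {
  GLKVector3 toCenter = GLKVector3Subtract(center, origin);
  float along = GLKVector3DotProduct(toCenter, direction);
  if (along < 0.0f) return NO;  // target is behind the viewer
  GLKVector3 closest =
      GLKVector3Add(origin, GLKVector3MultiplyScalar(direction, along));
  return GLKVector3Distance(closest, center) <= radius;
}

// Call once per frame with the current head position and forward vector.
static void UpdateGaze(GazeTarget *target, GLKVector3 headPosition,
                       GLKVector3 headForward) {
  BOOL hit = RayHitsSphere(headPosition, GLKVector3Normalize(headForward),
                           target->center, target->radius);
  if (hit && !target->hasFocus) {
    // "OnGazeEnter": e.g. highlight the screen embedded in the monitor model.
  } else if (!hit && target->hasFocus) {
    // "OnGazeExit": remove the highlight.
  }
  target->hasFocus = hit;
}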


This bubble gun is not bad either. That is why there are two-handed grips and AK-47-style gun peripherals: they are all ways of supplying a sense of realism, like the steering wheel for racing games. Maybe someday we will sit in a tank mock-up at home and play VR battles.

Read More

What are VR views from Google?

VR views take 360 VR images or videos and transform them into interactive experiences that users can view on their phone, with a Cardboard viewer, or on their desktop computer. For native apps, you can embed a VR view by grabbing the latest Cardboard SDK for Android or iOS* and adding a few lines of code. On the web, embedding a VR view is as simple as adding an iframe on your site. We’re open-sourcing the HTML and JavaScript for web developers on github, so you can self-host and modify it to match your needs.

From travel and real estate to news and entertainment, we hope embeddable VR views make it quick and easy to share your story and build immersive and engaging visual experiences your users will love. We’re excited to see what you create.

*Yes, you read that right! Starting today, there’ll be a native Cardboard SDK for iOS. Provided in idiomatic Objective C, and packaged as a single, easy-to-use CocoaPod, this new SDK includes all of the features already available in the Cardboard SDK for Android.

Cool!

Read More

Some tips on how to use Cocos to develop VR projects

This is a talk given by 红孩儿, CEO of 火石镇, at the Cocos spring developer conference.

http://weibo.com/2976628362/DoXfEfgLt

Worth thinking through in depth in combination with the earlier notes.

In addition, the Cocos VR engine that has already been released integrates devices such as Oculus, Deepoon, and Gear VR at the low level, with the focus mainly on Oculus. The talk also mentions Nibiru, which apparently has done a lot of integration work as well; presumably more of this will be released over time.

And of course Cardboard, the most mainstream option, should be no problem at all.

The vision is great, but back in reality: after listening to the recording above, you will probably also feel that something is missing. What is it? In my view, it is the lack of content-creation tools. The 3D side of Cocos has never had a fully developed Editor. The low-level support is there, but without an authoring tool, VR, which is even more demanding than plain 3D, simply cannot be done; poor visualization directly makes production difficult.

It honestly feels a bit underwhelming, because hooking up the devices is not actually the hardest part; content creation is.

I have also heard that Cocos Creator plans to add a 3D part (PlayCanvas has had 3D editing for a long time). But if you want to build VR with Cocos right now, you can only pick something lightweight, or simply turn the game camera into a VR view. Even then there are plenty of details to handle: HUD/UI treatment, controls, and so on all work rather differently.

So 红孩儿 also suggests that developers build some suitable editing tools of their own.

Read More

Official Cardboard SDK for iOS released by Google

Cardboard SDK for iOS

The Cardboard project aims at developing inexpensive virtual reality (VR) tools to allow everyone to enjoy VR in a simple, fun, and natural way. The Cardboard SDK for iOS enables developers familiar with OpenGL to quickly start creating VR applications. The toolkit simplifies many common VR development tasks, including:

  • Lens distortion correction.
  • Spatial audio.
  • Head tracking.
  • 3D calibration.
  • Side-by-side rendering.
  • Stereo geometry configuration.
  • User input event handling.

We’re keeping the hardware and software open to encourage community participation and compatibility with VR content available elsewhere.

To learn more:

Getting Started

This document describes how to use the experimental Cardboard SDK for iOS to create your own Virtual Reality (VR) experiences.

Cardboard is a simple device that unlocks the power of your smartphone as a VR platform. Working with Cardboard, your phone can display 3D scenes with binocular rendering, track and react to head movements, and interact with apps by activating the trigger input.

The Cardboard SDK contains tools for spatial audio which goes far beyond simple left side/right side audio cues to offer 360 degrees of sound. You can also control the tonal quality of the sound—for example, you can make a conversation in a small spaceship sound drastically different than one in a large, underground (and still virtual) cave.

The demo app used in this tutorial, “Treasure Hunt,” is a basic game, but it demonstrates the core features of Cardboard. In the game, users look around a virtual world to find and collect objects. It demonstrates some basic features, such as lighting, movement in space, and coloring. It shows how to set up the trigger input, detect if the user is looking at something, set up spatial audio, and render images by providing a different view for each eye.

Before you begin

To build the demo app, you must have the following:

  • Xcode 7.1 or higher.
  • CocoaPods. To download and install, go to cocoapods.org.
  • A physical iPhone running iOS 7 or higher.

Downloading and building the app

  1. Clone the demo app from the GitHub repository by running the following command:

    git clone https://github.com/googlesamples/cardboard-ios.git

  2. In the Terminal, navigate to the TreasureHunt folder, and then run the following:

    pod update

    The CardboardSDK cocoapod is added to the TreasureHunt project.

  3. In Xcode, open the TreasureHunt.xcworkspace project and then click the Run button.

    Here is a screenshot of Xcode with the TreasureHunt.xcworkspace project loaded:

You’re ready to use the Cardboard SDK for iOS!

Playing the game

The goal of the game is to find the cubes in 3D space and collect them. To enjoy the benefits of spatial audio, be sure to wear headphones.

To find and collect a cube

  1. Move your head in any direction until a cube enters your field of view.
  2. Look directly at the cube. This causes it to turn orange.
  3. Activate the trigger input. This collects the cube.

Overview of the code

The TreasureHunt app renders an OpenGL scene for each eye in VR mode. The sections below provide details about the following tasks:

Implement a UIViewController to host GCSCardboardView

The TreasureHunt app implements a UIViewController, the TreasureHuntViewController class, that hosts an instance of the GCSCardboardView class. An instance of the TreasureHuntRenderer class is created and set as a GCSCardboardViewDelegate for the GCSCardboardView. In addition, the app provides a render loop, the TreasureHuntRenderLoop class, that drives the - render method of the GCSCardboardView.

- (void)loadView {
  _treasureHuntRenderer = [[TreasureHuntRenderer alloc] init];
  _treasureHuntRenderer.delegate = self;

  _cardboardView = [[GCSCardboardView alloc] initWithFrame:CGRectZero];
  _cardboardView.delegate = _treasureHuntRenderer;
  ...
  _cardboardView.vrModeEnabled = YES;
  ...
  self.view = _cardboardView;
}

Define a renderer to implement the GCSCardboardViewDelegate protocol

GCSCardboardView provides a drawing surface for your rendering. It coordinates the drawing with your rendering code through the GCSCardboardViewDelegate protocol. To achieve this, the TreasureHuntRenderer class implements GCSCardboardViewDelegate:

#import "GCSCardboardView.h"

/** TreasureHunt renderer. */
@interface TreasureHuntRenderer : NSObject<GCSCardboardViewDelegate>

@end

Implement the GCSCardboardViewDelegate protocol

To draw the GL content onto GCSCardboardView, TreasureHuntRenderer implements the GCSCardboardViewDelegate protocol:

@protocol GCSCardboardViewDelegate<NSObject>

- (void)cardboardView:(GCSCardboardView *)cardboardView
         didFireEvent:(GCSUserEvent)event;

- (void)cardboardView:(GCSCardboardView *)cardboardView
     willStartDrawing:(GCSHeadTransform *)headTransform;

- (void)cardboardView:(GCSCardboardView *)cardboardView
     prepareDrawFrame:(GCSHeadTransform *)headTransform;

- (void)cardboardView:(GCSCardboardView *)cardboardView
              drawEye:(GCSEye)eye
    withHeadTransform:(GCSHeadTransform *)headTransform;

- (void)cardboardView:(GCSCardboardView *)cardboardView
   shouldPauseDrawing:(BOOL)pause;

@end

Implementations for the willStartDrawing, prepareDrawFrame, and drawEye methods are described below.

Implement willStartDrawing

To perform one-time GL state initialization, implement - cardboardView:willStartDrawing:. Use this opportunity to load shaders, initialize scene geometry, and bind to GL parameters. We also initialize an instance of the GCSCardboardAudioEngine class here:

- (void)cardboardView:(GCSCardboardView *)cardboardView
     willStartDrawing:(GCSHeadTransform *)headTransform {
  // Load shaders and bind GL attributes.
  // Load mesh and model geometry.
  // Initialize GCSCardboardAudio engine.
  _cardboard_audio_engine =
  [[GCSCardboardAudioEngine alloc]initWithRenderingMode:
      kRenderingModeBinauralHighQuality];
  [_cardboard_audio_engine preloadSoundFile:kSampleFilename];
  [_cardboard_audio_engine start];
  ...
  [self spawnCube];
}

Implement prepareDrawFrame

To set up rendering logic before the individual eyes are rendered, implement - cardboardView:prepareDrawFrame:. Any per-frame operations specific to this rendering should happen here. This is a good place to update your model and clear the GL state for drawing. The app computes the head orientation and updates the audio engine.

- (void)cardboardView:(GCSCardboardView *)cardboardView
     prepareDrawFrame:(GCSHeadTransform *)headTransform {
  GLKMatrix4 head_from_start_matrix = [headTransform headPoseInStartSpace];
  // Update audio listener's head rotation.
  const GLKQuaternion head_rotation =
      GLKQuaternionMakeWithMatrix4(GLKMatrix4Transpose(head_from_start_matrix));
  [_cardboard_audio_engine setHeadRotation:head_rotation.q[0]
                                         y:head_rotation.q[1]
                                         z:head_rotation.q[2]
                                         w:head_rotation.q[3]];
  // Update the audio engine.
  [_cardboard_audio_engine update];

  // Clear the GL viewport.
  glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
  glEnable(GL_DEPTH_TEST);
  glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
  glEnable(GL_SCISSOR_TEST);
}

Implement drawEye

This is the core of the rendering code, and it is very similar to building a regular OpenGL ES application. The following snippet shows how to implement drawEye to get the view transformation matrix and the perspective transformation matrix for each eye. Note that this method is called once per eye. If the GCSCardboardView does not have VR mode enabled, eye is set to the center eye; this supports “mono” rendering, which is handy for providing a non-VR view of the 3D scene.

- (void)cardboardView:(GCSCardboardView *)cardboardView
              drawEye:(GCSEye)eye
    withHeadTransform:(GCSHeadTransform *)headTransform {
  // Set the viewport.
  CGRect viewport = [headTransform viewportForEye:eye];
  glViewport(viewport.origin.x, viewport.origin.y, viewport.size.width,
      viewport.size.height);
  glScissor(viewport.origin.x, viewport.origin.y, viewport.size.width,
      viewport.size.height);

  // Get the head matrix.
  const GLKMatrix4 head_from_start_matrix =
      [headTransform headPoseInStartSpace];

  // Get this eye's matrices.
  GLKMatrix4 projection_matrix = [headTransform
      projectionMatrixForEye:eye near:0.1f far:100.0f];
  GLKMatrix4 eye_from_head_matrix =
      [headTransform eyeFromHeadMatrix:eye];

  // Compute the model view projection matrix.
  GLKMatrix4 model_view_projection_matrix = GLKMatrix4Multiply(projection_matrix,
      GLKMatrix4Multiply(eye_from_head_matrix, head_from_start_matrix));

  // Render from this eye.
  [self renderWithModelViewProjectionMatrix:model_view_projection_matrix.m];
}

After returning from this call, GCSCardboardView renders the scene to the display.

The rendering needs to be driven by a render loop using CADisplayLink. The TreasureHunt app provides a sample render loop: TreasureHuntRenderLoop. This needs to call the - render method of the GCSCardboardView class. This is handled in the - viewWillAppear: and - viewDidDisappear: methods of the TreasureHuntViewController class:

- (void)viewWillAppear:(BOOL)animated {
  [super viewWillAppear:animated];

  _renderLoop = [[TreasureHuntRenderLoop alloc]
   initWithRenderTarget:_cardboardView selector:@selector(render)];
}

- (void)viewDidDisappear:(BOOL)animated {
  [super viewDidDisappear:animated];

  [_renderLoop invalidate];
  _renderLoop = nil;
}
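
TreasureHuntRenderLoop itself is not reproduced in this post; the sketch below shows roughly what such a CADisplayLink-driven loop could look like, matching the initWithRenderTarget:selector: call above. It is an illustration under that assumption, not the sample's actual source.

#import <Foundation/Foundation.h>
#import <QuartzCore/QuartzCore.h>

@interface TreasureHuntRenderLoop : NSObject
- (instancetype)initWithRenderTarget:(id)target selector:(SEL)selector;
- (void)invalidate;
@end

@implementation TreasureHuntRenderLoop {
  CADisplayLink *_displayLink;
}

- (instancetype)initWithRenderTarget:(id)target selector:(SEL)selector {
  if ((self = [super init])) {
    // CADisplayLink fires once per display refresh and retains its target,
    // so the loop must be invalidated (as in -viewDidDisappear: above) to
    // break that reference and stop the callbacks.
    _displayLink = [CADisplayLink displayLinkWithTarget:target selector:selector];
    [_displayLink addToRunLoop:[NSRunLoop mainRunLoop]
                       forMode:NSRunLoopCommonModes];
  }
  return self;
}

- (void)invalidate {
  [_displayLink invalidate];  // removes the link from the run loop
  _displayLink = nil;
}

@end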

Handling inputs

The Cardboard SDK detects events which are triggered by activating the trigger input. To provide custom behavior when these events occur, implement the - cardboardView:didFireEvent: delegate method.

- (void)cardboardView:(GCSCardboardView *)cardboardView
         didFireEvent:(GCSUserEvent)event {
  switch (event) {
    case kGCSUserEventBackButton:
    // If the view controller is in a navigation stack or
    // over another view controller, pop or dismiss the
    // view controller here.
    break;
    case kGCSUserEventTrigger:
     NSLog(@"User performed trigger action");
     // Check whether the object is found.
     if (_is_cube_focused) {
       // Vibrate the device on success.
       AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);
       // Generate the next cube.
       [self spawnCube];
     }
     break;
  }
}

Read More