VR Musings VIII - Where the Cocos VR Engine Stands, and Where It's Headed

Cocos VR supports the Oculus Rift as the baseline PC HMD, and is also compatible with the domestic Deepoon headsets. Mobile is presumably the main target, given that cross-platform support has always been Cocos's selling point.

Cardboard is currently supported on Android; iOS support shouldn't be a big problem either (fingers crossed).

Gear VR: same status as Oculus.

HTC Vive: not supported.

3Glasses: not supported either.

The Nibiru all-in-one headset is reportedly supported too, but I haven't seen it, and without the hardware I have no way to test it.

Overall, once the engine handles VR rendering, the barrier to entry for developers drops. What's easy to overlook, though, are the performance issues that directly shape the experience. Add to that the gaps in the Cocos 3D toolchain, and progress will be held back for a while yet. The entries at this hackathon were mostly small casual titles, which is actually true to Cocos's own positioning: small games first.

Since Oculus is supported, developers can follow Oculus's lead and do plenty of exploratory work. In terms of devices and development workflow, though, don't expect anything as flashy as the in-editor VR editing that UE and Unity are building.

Back to development. The low-level work is the engine's job: it ships with the basics such as split-screen rendering and lens distortion. At this level, Cocos currently builds on Cardboard. Device-specific optimizations still have to come from the device vendors themselves, by fixing their drivers and optimizing the transport path. A clear example is the Direct HMD mode supported by NVIDIA's drivers, which is GPU-vendor-level support; expect to see many more points like this. That part is the hardware vendors' concern, and Intel and AMD are both very active on the all-in-one front as well.
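
The "split-screen + distortion" layer can be pictured with a minimal sketch of the usual two-coefficient radial (barrel) distortion that Cardboard-style renderers apply to each eye's image. The k1/k2 values below are illustrative, not any particular device's calibration:

```python
def barrel_distort(x, y, k1=0.22, k2=0.24):
    """Radial (barrel) distortion of a point in lens-centered normalized
    coordinates, used to pre-warp each eye's image. k1/k2 here are
    made-up illustrative coefficients, not real lens parameters."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

# The lens center is untouched; points farther from it are pushed
# outward, so that the lens's pincushion distortion cancels the warp.
print(barrel_distort(0.0, 0.0))
print(barrel_distort(0.5, 0.0))
```

The engine applies this per-pixel (usually in a post-process shader or via a distortion mesh), once for each half of the split screen.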

As developers, our focus should be the content-production side, for example sensibly optimizing the rendered region. This is also where the headset vendors' SDK support intersects with our work. Without such optimizations things are still viewable; with them, the result is better. The API level and working model an SDK exposes become a point of differentiation: a good SDK can compensate for weaker hardware and for gaps in a developer's skills. Domestic VR vendors haven't done this particularly well yet; there's a rift between their SDKs and their production support. The way to close it has already appeared: HMD vendors partnering with engines, as Oculus has with Unity and UE, and Cocos can now target these mainstream devices too.
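
To make the payoff of shrinking the rendered region concrete: fragment cost scales with area, so savings are quadratic in the scale factor. A back-of-the-envelope sketch (the 1080x1200-per-eye resolution is just an assumed example):

```python
def pixel_cost(width, height, render_scale=1.0):
    """Per-eye fragment workload, approximated as rendered pixel count."""
    return int(width * render_scale) * int(height * render_scale)

full = pixel_cost(1080, 1200)         # one eye of a hypothetical 2160x1200 panel
scaled = pixel_cost(1080, 1200, 0.8)  # render at 80% scale per axis
print(f"saved: {1 - scaled / full:.0%}")  # → saved: 36% (area shrinks quadratically)
```

A 20% linear reduction already frees over a third of the fill-rate budget, which is why render-region and render-scale knobs are among the first optimizations VR SDKs expose.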

Beyond the general-purpose engines, there are capable developers (and studios) in the industry who can optimize their own production workflows, and that will clearly become a core competitive advantage. Without solid 3D expertise backing you up, VR simply can't be pulled off. At the production level, everything needs rethinking from scratch: the old methods and pipelines do not carry over directly to VR.

In one sentence: it can be done. Whether it's done well comes down to fundamentals, or to the ecosystem filling the gaps: asset stores, engine support, developer partnerships, and so on. Timing-wise there is certainly a window, and earlier beats later. Keep watching, everyone. Slow is fine; standing still is not.

Packaging this up AnySDK-style would be nice too, though it clashes somewhat with what AnySDK is for: VR belongs to a higher tier of 3D content production.



VR Musings VII - Don't Forget the Audio

Yes, audio. I suspect many people don't take sound and sound effects all that seriously. After all, that part doesn't demand much effort: a few lines of code and you're done.

In a VR product, though, sound effects become critically important. One day I noticed just how different good and bad audio can be: a clip played on an Android phone sounded thin and crackly, while another played on an iPad Air came through rich and full, with a sense of distance as well…

Try it again with Apple's stock earphones and it gets even better.

Everyone has long known there's a difference, of course. But in a VR product, how much sound contributes to the experience, and just how important it really is, doesn't yet have a precise answer.

An example: in an FPS, hearing a footstep coming from behind you without having to turn around, isn't that a great experience?
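
That directional cue boils down to, at minimum, a left/right level difference derived from the source's azimuth relative to the head. A toy constant-power pan (a real spatializer/HRTF also models elevation, timing, and filtering):

```python
import math

def stereo_gains(source_angle_deg):
    """Constant-power pan from a source azimuth relative to where the
    head is facing: 0 = straight ahead, 90 = right, +/-180 = behind.
    This is only the left/right level cue, nothing more."""
    az = math.radians(source_angle_deg)
    pan = math.sin(az)                       # -1 = hard left, +1 = hard right
    left = math.cos((pan + 1.0) * math.pi / 4.0)
    right = math.sin((pan + 1.0) * math.pi / 4.0)
    return left, right

l, r = stereo_gains(-135)   # footstep behind-left of the player
print(l > r)                # louder in the left ear
```

Because the head transform updates every frame in VR, these gains have to be recomputed continuously as the player turns, which is exactly what the spatial-audio engines in VR SDKs do for you.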

Without cues like that, players get lost easily. Once the headset is on, the eyes are sealed off, and if the game doesn't supply enough information, players end up disoriented, unable to tell which way is which. Saving the world is up to you.

Game SFX



Leap Motion VR Support Now Directly Integrated in Unreal Engine


Turns out it's a community-contributed open-source plugin: https://github.com/getnamo/leap-ue4

Quick Setup Guide

  • Create a new project, or open a project.
  • Select Window and Plugins.
  • Click on the Input Devices category and find the Leap Motion plugin.
  • Select Enabled. The Editor will warn you to restart, so click restart.
  • The plugin should be enabled and ready to use.
To try the rigged or floating hands characters, change your game mode to use LeapRiggedCharacter or LeapFloatingHandsCharacter as your default pawn.


VR Musings VI - On Optimization

Keyword: needs OpenGL extension support, GL_OVR_multiview.

From the Intel Stereo 3D SDK. Check it here: https://software.intel.com/en-us/intel-stereo-3d-sdk

Intel® Stereo 3D SDK | Intel® Developer Zone

Features

3D Camera Generation

Use API to generate stereo 3D cameras
Just input your previous camera matrices, such as view matrix and projection matrix, to the SDK. The SDK can automatically generate 3D camera matrices and return them.

Auto Scene Adjustment

Adapts to different game scenes and adjusts disparities
Just as good stereo 3D photographers adjust their 3D cameras according to different scene styles, the SDK can auto-detect scene styles and auto-change camera settings and screen disparities. 

Apparently it auto-adjusts the cameras and the scene. Looks pretty impressive.
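
The translation half of the "3D Camera Generation" feature above is easy to sketch: split one mono camera into two eye cameras offset by half the interpupillary distance along the camera's right vector. The SDK works on full view/projection matrices; the 0.064 m IPD here is just a typical assumed value:

```python
def stereo_eyes(cam_pos, cam_right, ipd=0.064):
    """Given a mono camera position and its unit right vector, return the
    left/right eye positions half an interpupillary distance apart.
    Only the translation part of stereo camera generation; the real SDK
    also derives per-eye projection matrices and screen disparities."""
    half = ipd / 2.0
    left = tuple(p - half * r for p, r in zip(cam_pos, cam_right))
    right = tuple(p + half * r for p, r in zip(cam_pos, cam_right))
    return left, right

# A camera at eye height looking down -z, with +x as its right vector:
l, r = stereo_eyes((0.0, 1.7, 0.0), (1.0, 0.0, 0.0))
print(l, r)   # two eyes 0.064 m apart along the camera's right axis
```

The "auto scene adjustment" feature then amounts to varying that separation (and the screen disparity) per scene, which is what the SDK claims to detect automatically.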



You can 3D-print a mount for the Leap Motion yourself.

http://www.thingiverse.com/thing:625401

There's another style that looks like this.

Not bad at all.

Designed to replace the standard cable door on the Oculus Rift DK2, the Leap Motion Cable Door Mount lets you easily attach and remove the Leap Motion Controller without the use of adhesive. For use with the Leap Motion VR tracking beta. Further information and resources available at https://developer.leapmotion.com/vr.

For best quality, please print in FDM PC-ABS material in .005″ slices.

Alternatively, our standard mount (designed for use on a variety of headsets) is available for download here: http://www.thingiverse.com/thing:445866

Copyright 2015 Leap Motion. Provided under license. 3D CAD file for personal non-commercial use only, as is.

I'm planning to ask Ga大 to run one off on his 曲率 printer and see how it turns out.


VR Musings V - Why Only One Camera

Had something of an epiphany in the restroom. It boils down to the following points:

1. Why is VR called virtual reality? Because it simulates. So if the game content is in a realistic style while remaining fully immersive, it really does come across as far more impressive than a cartoon style. Anyone who has seen Inception will remember the spinning top, the one reliable way to break out of the dream (the immersion), the other being a fall. By contrast, when we look at a world full of cartoon characters, we seem to always know we're not in the real world.


2. Why only one camera: people only have one pair of eyes, so the brain can't fill in cuts between shots. The whole vocabulary of traditional film-making stops working, and in fact breaks immersion. How could one person see five camera angles at once, right?

3. A HUD plastered over the camera seems to constantly remind players that they're inside a game.

4. Fine, the assumptions above apply when we're chasing a high degree of realism. As a fallback, perhaps a sufficiently solid stereoscopic 3D feel is already enough.

So how do we do it?

1. Ideally, build genuinely interactive UI rather than hanging it off the HUD: activate it through head rotation and a gaze ray, along the lines of OnMouseEnter / OnMouseExit. For instance, a monitor whose responsive UI is the screen embedded in it feels very real; likewise an instrument panel with switches to operate, and so on.
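
A minimal sketch of that gaze-driven UI, assuming the panel sits on a z-plane in head space with the gaze ray starting at the origin; all names here (`GazeButton`, `gaze_point_on_panel`) are hypothetical, not any engine's API:

```python
def gaze_point_on_panel(forward, panel_z):
    """Intersect the gaze ray (from the origin, along `forward`) with the
    plane z = panel_z; return the (x, y) hit point, or None if the ray
    points away from the panel."""
    fx, fy, fz = forward
    if fz * panel_z <= 0:          # ray must head toward the plane
        return None
    t = panel_z / fz
    return fx * t, fy * t

class GazeButton:
    """Fires enter/exit once per transition, like OnMouseEnter/OnMouseExit."""
    def __init__(self, center, half_w, half_h):
        self.center, self.half_w, self.half_h = center, half_w, half_h
        self.hovered = False

    def update(self, hit):
        inside = (hit is not None
                  and abs(hit[0] - self.center[0]) <= self.half_w
                  and abs(hit[1] - self.center[1]) <= self.half_h)
        event = None
        if inside and not self.hovered:
            event = "enter"
        elif not inside and self.hovered:
            event = "exit"
        self.hovered = inside
        return event

btn = GazeButton(center=(0.0, 0.0), half_w=0.2, half_h=0.1)
print(btn.update(gaze_point_on_panel((0.0, 0.0, -1.0), -2.0)))  # looking at it
print(btn.update(gaze_point_on_panel((0.9, 0.0, -0.5), -2.0)))  # looked away
```

Running `update` once per frame with the current head-forward vector gives the enter/exit events the UI needs; activation (a "click") is then typically a dwell timer or a trigger press while hovered.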

2. Keep the camera under the player's control at all times, preserving a genuine sense of distance, even when playing from a fixed position.

3. Deliver realism: would reaching out and pulling down a breaker lever actually feel good? Questions like these can only be settled by trying them in person. There are no universal rules, but this kind of experiential design can be worked out through plenty of on-device testing. By then, though, it's no longer a straight port.


This bubble gun isn't bad either. That's why we now have two-handed grip controllers and AK-47-style gun peripherals: they're all ways of supplying a sense of realism, like a racing game's steering wheel. Maybe someday we'll sit at home in a mock tank playing VR battles.


What are VR views, from Google

VR views take 360 VR images or videos and transform them into interactive experiences that users can view on their phone, with a Cardboard viewer, or on their desktop computer. For native apps, you can embed a VR view by grabbing the latest Cardboard SDK for Android or iOS* and adding a few lines of code. On the web, embedding a VR view is as simple as adding an iframe on your site. We’re open-sourcing the HTML and JavaScript for web developers on github, so you can self-host and modify it to match your needs.

From travel and real estate to news and entertainment, we hope embeddable VR views make it quick and easy to share your story and build immersive and engaging visual experiences your users will love. We’re excited to see what you create.

*Yes, you read that right! Starting today, there’ll be a native Cardboard SDK for iOS. Provided in idiomatic Objective C, and packaged as a single, easy-to-use CocoaPod, this new SDK includes all of the features already available in the Cardboard SDK for Android.

Cool!


Some tips on using Cocos to develop VR projects

This is the talk 红孩儿 (CEO of 火石镇) gave at the Cocos spring developer conference.

http://weibo.com/2976628362/DoXfEfgLt

Worth combining with some deeper thinking of your own.

Also, the released Cocos VR engine already integrates Oculus, Deepoon, and Gear VR at the low level, with Oculus as the main focus. The talk mentions Nibiru as well; apparently a lot of integration work has been done there, and presumably it will be released over time.

And of course Cardboard, the most mainstream option, should pose no problem either.

The vision is lovely; back in reality, after listening to the recording above you'll probably also feel something is missing. What? In my view: the tools for producing content. The editor features for Cocos's 3D side have never been fully built out. The low level can support VR now, but without an authoring tool, VR, which demands even more than plain 3D, is unworkable. Poor visual tooling translates directly into painful production.

It does feel a bit half-baked, because hooking up devices was never the hard part; content production is.

There's also word that Cocos Creator has 3D on its roadmap (PlayCanvas has had a 3D editor for ages). For now, if you want to build VR with Cocos, you're limited to lightweight projects, or to simply VR-ifying a game's camera. Even then plenty of details need handling: HUD and UI treatment, controls, and so on all work differently.

Hence 红孩儿's advice that developers build suitable editing tools of their own.



Official Cardboard SDK for iOS released by Google

Cardboard SDK for iOS

The Cardboard project aims at developing inexpensive virtual reality (VR) tools to allow everyone to enjoy VR in a simple, fun, and natural way. The Cardboard SDK for iOS enables developers familiar with OpenGL to quickly start creating VR applications. The toolkit simplifies many common VR development tasks, including:

  • Lens distortion correction.
  • Spatial audio.
  • Head tracking.
  • 3D calibration.
  • Side-by-side rendering.
  • Stereo geometry configuration.
  • User input event handling.

We’re keeping the hardware and software open to encourage community participation and compatibility with VR content available elsewhere.


Getting Started

This document describes how to use the experimental Cardboard SDK for iOS to create your own Virtual Reality (VR) experiences.

Cardboard is a simple device that unlocks the power of your smartphone as a VR platform. Working with Cardboard, your phone can display 3D scenes with binocular rendering, track and react to head movements, and interact with apps by activating the trigger input.

The Cardboard SDK contains tools for spatial audio which goes far beyond simple left side/right side audio cues to offer 360 degrees of sound. You can also control the tonal quality of the sound—for example, you can make a conversation in a small spaceship sound drastically different than one in a large, underground (and still virtual) cave.

The demo app used in this tutorial, “Treasure Hunt,” is a basic game, but it demonstrates the core features of Cardboard. In the game, users look around a virtual world to find and collect objects. It demonstrates some basic features, such as lighting, movement in space, and coloring. It shows how to set up the trigger input, detect if the user is looking at something, set up spatial audio, and render images by providing a different view for each eye.

Before you begin

To build the demo app, you must have the following:

  • Xcode 7.1 or higher.
  • CocoaPods. To download and install, go to cocoapods.org.
  • A physical iPhone running iOS 7 or higher.

Downloading and building the app

  1. Clone the demo app from the GitHub repository by running the following command:

    git clone https://github.com/googlesamples/cardboard-ios.git

  2. In the Terminal, navigate to the TreasureHunt folder, and then run the following:

    pod update

    The CardboardSDK cocoapod is added to the TreasureHunt project.

  3. In Xcode, open the TreasureHunt.xcworkspace project and then click the Run button.

    Here is a screenshot of Xcode with the TreasureHunt.xcworkspace project loaded:

You’re ready to use the Cardboard SDK for iOS!

Playing the game

The goal of the game is to find the cubes in 3D space and collect them. To enjoy the benefits of spatial audio, be sure to wear headphones.

To find and collect a cube

  1. Move your head in any direction until a cube enters your field of view.
  2. Look directly at the cube. This causes it to turn orange.
  3. Activate the trigger input. This collects the cube.
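
The "look directly at the cube" check in step 2 can be modeled as a small cone test around the gaze direction; the 10° threshold below is an assumption for illustration, not the sample's actual value:

```python
import math

def is_focused(forward, obj_dir, max_angle_deg=10.0):
    """True when the direction to the object lies within a small cone
    around the gaze forward vector. Both arguments are (approximately)
    unit vectors; a check in this spirit decides when the cube highlights."""
    dot = sum(f * o for f, o in zip(forward, obj_dir))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot))))
    return angle <= max_angle_deg

print(is_focused((0, 0, -1), (0, 0, -1)))          # looking right at it
print(is_focused((0, 0, -1), (0.26, 0, -0.966)))   # ~15 degrees off: not focused
```

Each frame the app recomputes `obj_dir` from the head pose and cube position, toggles the highlight when focus changes, and collects the cube if the trigger fires while focused.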

Overview of the code

The TreasureHunt app renders an OpenGL scene for each eye in VR mode. The sections below provide details about the following tasks:

Implement a UIViewController to host GCSCardboardView

The TreasureHunt app implements a UIViewController, the TreasureHuntViewController class, that hosts an instance of the GCSCardboardView class. An instance of the TreasureHuntRenderer class is created and set as a GCSCardboardViewDelegate for the GCSCardboardView. In addition, the app provides a render loop, the TreasureHuntRenderLoop class, that drives the - render method of the GCSCardboardView.

- (void)loadView {
  _treasureHuntRenderer = [[TreasureHuntRenderer alloc] init];
  _treasureHuntRenderer.delegate = self;

  _cardboardView = [[GCSCardboardView alloc] initWithFrame:CGRectZero];
  _cardboardView.delegate = _treasureHuntRenderer;
  ...
  _cardboardView.vrModeEnabled = YES;
  ...
  self.view = _cardboardView;
}

Define a renderer to implement the GCSCardboardViewDelegate protocol

GCSCardboardView provides a drawing surface for your rendering. It coordinates the drawing with your rendering code through the GCSCardboardViewDelegate protocol. To achieve this, the TreasureHuntRenderer class implements GCSCardboardViewDelegate:

#import "GCSCardboardView.h"

/** TreasureHunt renderer. */
@interface TreasureHuntRenderer : NSObject<GCSCardboardViewDelegate>

@end

Implement the GCSCardboardViewDelegate protocol

To draw the GL content onto GCSCardboardView, TreasureHuntRenderer implements the GCSCardboardViewDelegate protocol:

@protocol GCSCardboardViewDelegate<NSObject>

- (void)cardboardView:(GCSCardboardView *)cardboardView
         didFireEvent:(GCSUserEvent)event;

- (void)cardboardView:(GCSCardboardView *)cardboardView
     willStartDrawing:(GCSHeadTransform *)headTransform;

- (void)cardboardView:(GCSCardboardView *)cardboardView
     prepareDrawFrame:(GCSHeadTransform *)headTransform;

- (void)cardboardView:(GCSCardboardView *)cardboardView
              drawEye:(GCSEye)eye
    withHeadTransform:(GCSHeadTransform *)headTransform;

- (void)cardboardView:(GCSCardboardView *)cardboardView
   shouldPauseDrawing:(BOOL)pause;

@end

Implementations for the willStartDrawing, prepareDrawFrame, and drawEye methods are described below.

Implement willStartDrawing

To perform one-time GL state initialization, implement - cardboardView:willStartDrawing:. Use this opportunity to load shaders, initialize scene geometry, and bind GL parameters. We also initialize an instance of the GCSCardboardAudioEngine class here:

- (void)cardboardView:(GCSCardboardView *)cardboardView
     willStartDrawing:(GCSHeadTransform *)headTransform {
  // Load shaders and bind GL attributes.
  // Load mesh and model geometry.
  // Initialize GCSCardboardAudio engine.
  _cardboard_audio_engine =
      [[GCSCardboardAudioEngine alloc] initWithRenderingMode:
          kRenderingModeBinauralHighQuality];
  [_cardboard_audio_engine preloadSoundFile:kSampleFilename];
  [_cardboard_audio_engine start];
  ...
  [self spawnCube];
}

Implement prepareDrawFrame

To set up rendering logic before the individual eyes are rendered, implement - cardboardView:prepareDrawFrame:. Any per-frame operations specific to this rendering should happen here. This is a good place to update your model and clear the GL state for drawing. The app computes the head orientation and updates the audio engine.

- (void)cardboardView:(GCSCardboardView *)cardboardView
     prepareDrawFrame:(GCSHeadTransform *)headTransform {
  GLKMatrix4 head_from_start_matrix = [headTransform headPoseInStartSpace];
  // Update audio listener's head rotation.
  const GLKQuaternion head_rotation =
      GLKQuaternionMakeWithMatrix4(GLKMatrix4Transpose(
      [headTransform headPoseInStartSpace]));
  [_cardboard_audio_engine setHeadRotation:head_rotation.q[0]
                                         y:head_rotation.q[1]
                                         z:head_rotation.q[2]
                                         w:head_rotation.q[3]];
  // Update the audio engine.
  [_cardboard_audio_engine update];

  // Clear the GL viewport.
  glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
  glEnable(GL_DEPTH_TEST);
  glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
  glEnable(GL_SCISSOR_TEST);
}
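
The transpose-then-quaternion step above works because a rotation matrix's inverse is its transpose. The matrix-to-quaternion conversion itself can be sketched as follows (trace-positive branch only; GLKQuaternionMakeWithMatrix4 handles all four branches):

```python
import math

def quat_from_matrix(m):
    """Extract a unit quaternion (x, y, z, w) from a 3x3 rotation matrix
    given as rows. Only the trace-positive branch is shown for brevity;
    a full implementation also handles the other three branches."""
    trace = m[0][0] + m[1][1] + m[2][2]
    w = math.sqrt(1.0 + trace) / 2.0
    x = (m[2][1] - m[1][2]) / (4.0 * w)
    y = (m[0][2] - m[2][0]) / (4.0 * w)
    z = (m[1][0] - m[0][1]) / (4.0 * w)
    return x, y, z, w

identity = ((1, 0, 0), (0, 1, 0), (0, 0, 1))
print(quat_from_matrix(identity))   # → (0.0, 0.0, 0.0, 1.0)

# A 90-degree yaw about +y gives equal y and w components:
yaw90 = ((0, 0, 1), (0, 1, 0), (-1, 0, 0))
print(quat_from_matrix(yaw90))
```

Feeding the inverse (transposed) head pose to the audio engine this way tells it how the world is rotated relative to the listener, which is what spatialization needs.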

Implement drawEye

This is the core of the rendering code, and very similar to building a regular OpenGL ES application. The following snippet shows how to implement drawEye to get the view transformation matrix and the perspective transformation matrix for each eye. Note that this method gets called once per eye. If the GCSCardboardView does not have VR mode enabled, then eye is set to the center eye, which is useful for providing a non-VR "mono" view of the 3D scene.

- (void)cardboardView:(GCSCardboardView *)cardboardView
              drawEye:(GCSEye)eye
    withHeadTransform:(GCSHeadTransform *)headTransform {
  // Set the viewport.
  CGRect viewport = [headTransform viewportForEye:eye];
  glViewport(viewport.origin.x, viewport.origin.y, viewport.size.width,
      viewport.size.height);
  glScissor(viewport.origin.x, viewport.origin.y, viewport.size.width,
      viewport.size.height);

  // Get the head matrix.
  const GLKMatrix4 head_from_start_matrix =
      [headTransform headPoseInStartSpace];

  // Get this eye's matrices.
  GLKMatrix4 projection_matrix = [headTransform
      projectionMatrixForEye:eye near:0.1f far:100.0f];
  GLKMatrix4 eye_from_head_matrix =
      [headTransform eyeFromHeadMatrix:eye];

  // Compute the model view projection matrix.
  GLKMatrix4 model_view_projection_matrix = GLKMatrix4Multiply(
      projection_matrix,
      GLKMatrix4Multiply(eye_from_head_matrix, head_from_start_matrix));

  // Render from this eye.
  [self renderWithModelViewProjectionMatrix:model_view_projection_matrix.m];
}

After returning from this call, GCSCardboardView renders the scene to the display.

The rendering needs to be driven by a render loop using CADisplayLink. The TreasureHunt app provides a sample render loop: TreasureHuntRenderLoop. This needs to call the - render method of the GCSCardboardView class. This is handled in the - viewWillAppear: and - viewDidDisappear: methods of the TreasureHuntViewController class:

- (void)viewWillAppear:(BOOL)animated {
  [super viewWillAppear:animated];

  _renderLoop = [[TreasureHuntRenderLoop alloc]
   initWithRenderTarget:_cardboardView selector:@selector(render)];
}

- (void)viewDidDisappear:(BOOL)animated {
  [super viewDidDisappear:animated];

  [_renderLoop invalidate];
  _renderLoop = nil;
}

Handling inputs

The Cardboard SDK detects events which are triggered by activating the trigger input. To provide custom behavior when these events occur, implement the - cardboardView:didFireEvent: delegate method.

- (void)cardboardView:(GCSCardboardView *)cardboardView
         didFireEvent:(GCSUserEvent)event {
  switch (event) {
    case kGCSUserEventBackButton:
    // If the view controller is in a navigation stack or
    // over another view controller, pop or dismiss the
    // view controller here.
    break;
    case kGCSUserEventTrigger:
     NSLog(@"User performed trigger action");
     // Check whether the object is found.
     if (_is_cube_focused) {
       // Vibrate the device on success.
       AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);
       // Generate the next cube.
       [self spawnCube];
     }
     break;
  }
}
