Unity + Deep Learning Support

Discussion in 'Feature Requests' started by Amin, Aug 21, 2017.

  1. Hi

    I know the reference page says that we can use MuJoCo along with advanced rendering engines such as Unity, but I believe that only works through the C/C++ interface. Is that right?
    I'm an AI researcher working on deep learning, and most of the top deep learning libraries are currently written in Python. So a very important feature for me is to be able to make the MuJoCo-Py library work with Unity. Is that possible?

    Cheers
    Amin
     
  2. Emo Todorov

    Emo Todorov Administrator Staff Member

    Probably, but you would have to do the work yourself. There are two approaches:

    1. Get MuJoCo and Unity to work together in C++ and wrap the whole thing in Python;
    2. Wrap both MuJoCo and Unity in Python separately, and then write Python code to exchange data between them as needed.

    I don't have experience with Unity or Unreal Engine, but my understanding is that these tools expect the developer to adopt their framework (that is why they are game engines and not just rendering engines), so integrating them with Python and using them as lightweight rendering engines for deep learning may be tricky...
     
  3. Hi Amin,

    You might find Unity ML Agents to be a good solution – it provides a way to connect Python to Unity and is specifically targeted for machine learning applications. I work for Unity, so I may be a bit biased, but hopefully this helps you! https://unity3d.com/machine-learning

    Cheers,
    Ed
     
  4. Emo Todorov

    Emo Todorov Administrator Staff Member

    Incidentally, I am in the middle of connecting MuJoCo to Unity and supporting it officially. The idea is to use MuJoCo for physics and Unity for rendering. The workflow will be as follows:

    1. Design a new MuJoCo model or use a pre-existing model. Get the physics to behave as desired.

    2. Convert the MuJoCo geometry to a Unity scene with static GameObjects. The conversion will be done in the Unity editor, with a C# script that calls MuJoCo as a native plugin.

    3. Use the Unity editor to make the scene look good. Optionally add objects in the Unity editor that can be exported to MuJoCo XML with another script. At the end of step 3, you have the same model with physics defined in MuJoCo and visualization defined in Unity, and one-to-one correspondence in terms of geometry.

    4. At runtime, MuJoCo is used for simulation and the resulting object poses are instantiated in Unity, which then renders the scene.

    The last step can have two variants:

    4a. Run MuJoCo in a separate executable (say MuJoCo-Py), with a socket connection to Unity (or shared memory connection, if you want to get images back for computer vision applications). In this case your executable can do whatever it wants, without any restrictions imposed by the Unity framework.

    4b. Run MuJoCo as a native plugin for Unity. This is more useful for end-user demos/games than for research, but is appealing because MuJoCo can enable higher fidelity physics-based gaming, as well as gaming that uses AI agents trained in MuJoCo simulations.


    This project is still under development, so comments at this stage are particularly welcome. Note that the same could be done with UE4 later, but I had to start somewhere, and Unity seems to be more popular in the gaming community.
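    To make the geometry conversion in steps 2-3 concrete, here is an illustrative sketch of the idea in Python. This is not the actual converter (which is a C# editor script calling the MuJoCo plugin); it only shows the kind of geom information that would flow from MJCF into a Unity scene, using the standard library XML parser:

```python
import xml.etree.ElementTree as ET

def extract_geoms(mjcf_xml):
    """Collect type/pos/size for every geom in an MJCF string.

    MuJoCo defaults geom type to "sphere" and pos to the body origin;
    a real converter would also resolve default classes, meshes,
    and body-frame transforms.
    """
    root = ET.fromstring(mjcf_xml)
    geoms = []
    for geom in root.iter("geom"):
        geoms.append({
            "type": geom.get("type", "sphere"),
            "pos": tuple(float(x) for x in geom.get("pos", "0 0 0").split()),
            "size": tuple(float(x) for x in geom.get("size", "").split()),
        })
    return geoms

# A tiny hypothetical model, just to exercise the function.
MODEL = """
<mujoco>
  <worldbody>
    <geom type="plane" size="1 1 0.1"/>
    <body pos="0 0 1">
      <geom type="capsule" size="0.05 0.2"/>
    </body>
  </worldbody>
</mujoco>
"""

print(extract_geoms(MODEL))
```

    On the Unity side, each extracted record would become a static GameObject (a scaled primitive or mesh), which the designer can then dress up in the editor.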
     
  5. Wow, that's great news!
    Do you think it's possible to release a test version so others can test and comment?
     
  6. Emo Todorov

    Emo Todorov Administrator Staff Member

    The integration code will be open source, so every MuJoCo Pro user will be able to test, comment on, and extend it. It will be ready when it's ready, not before :) My request for comments above was about general comments regarding use cases etc. People will have to play with it after the initial release before they can come up with more specific comments.
     
  7. Hi, since my students and I have done this earlier with the Open Dynamics Engine for some SIGGRAPH papers on humanoid trajectory optimization, I thought I should share some experiences:

    - We at first wrapped the ODE engine as a native plugin for Unity, using SWIG to do most of the work. However, SWIG is a pain to configure for anything but very simple data types, and the managed C# to unmanaged C++ marshaling easily generates heap allocations whenever one gets/sets vectors and quaternions from the simulator.

    - Debugging C++ together with Unity C# code is likewise a pain: it's easy to freeze the Unity editor, and constantly killing and restarting it slows down iterative development. There also seems to be code that prevents the C++ debugger from launching the editor, so one must manually attach/detach the debugger after Unity is already running. Also, as Unity autogenerates the Visual Studio project files, one can't have both the plugin and the Unity project in the same Visual Studio solution.

    - Unity had severe restrictions on multithreading in the past, and probably still has some: only a subset of Unity's API can be called from threads other than the main thread. For example, we had strange freezes because we were checking whether a GameObject handle was valid; this invoked some internal API that was not thread-safe.

    - Because of the reasons above (and some others), we later switched to building a visualizer executable with Unity that loads a C++ "client" DLL with a simple API. One can then implement essentially all demo and simulation functionality in C++ as a client DLL. We also have an OpenGL version of the same visualizer for faster iteration (all Unity standalone executables have a non-negligible initial loading time).

    - However, developing and debugging custom per-project visualizer features is cumbersome from the debugging and version-management side, so in the future I think we will simply use OpenGL for development and then stream all the render commands to a file. We then have more flexibility in loading and executing the render commands in custom Unity projects with fine-tuned skyboxes, lighting, etc. This is simple and does not cause significant overhead as long as we only simulate and render fairly simple geometric primitives such as capsules, boxes, etc.
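    The render-command streaming idea above can be approximated with something as simple as one JSON line per frame. This is only a sketch of the concept (not the authors' code, which is C++/C#); each frame records the primitives to draw, and a loader on the Unity side would replay them:

```python
import io
import json

def record_frame(stream, primitives):
    """Append one frame of draw commands (capsules, boxes, ...) as a JSON line."""
    stream.write(json.dumps(primitives) + "\n")

def replay(stream):
    """Yield recorded frames back in order; a Unity project would
    instantiate/update GameObjects from these records."""
    for line in stream:
        yield json.loads(line)

# Demonstrate a round trip through an in-memory "file".
buf = io.StringIO()
record_frame(buf, [{"shape": "capsule", "pos": [0, 0, 1],
                    "radius": 0.05, "half_len": 0.2}])
record_frame(buf, [{"shape": "box", "pos": [0, 0, 0.5],
                    "size": [0.1, 0.1, 0.1]}])
buf.seek(0)
frames = list(replay(buf))
print(len(frames), frames[0][0]["shape"])
```

    For simple primitives the per-frame records stay tiny, which is why the overhead is negligible; a binary format would be the obvious next step if frame counts grow large.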

    If you want to check our ODE to Unity code, it's in the UnityODE folder of this package:
    https://mediatech.aalto.fi/~phamalainen/FutureGameAnimation/SIGGRAPH2015_CPBP_code.zip
     
  8. What will be the programming language used in this interface? Will we have to code in both Python and C# to make it work?
     
  9. Emo Todorov

    Emo Todorov Administrator Staff Member

    Thanks for the suggestions! I converged onto something similar to your approach. MuJoCo is wrapped as a DLL plugin, which exports the model geometry to the Unity Editor (so people can make it look good offline) and then at runtime generates GameObject transforms which Unity renders. So Unity has no idea that there is a physics simulation going on; if you were to write a plugin that plays back motion-capture data from some database, it would look the same to Unity. This will be released in about a week.
     
  10. Emo Todorov

    Emo Todorov Administrator Staff Member

    It will be a remote renderer, allowing you to make MuJoCo models look good using all of Unity's visualization features. At runtime you will send the model pose (qpos) over a socket, and Unity will render the model in that configuration. What is on the other end of the socket is up to you: it could be an existing Python wrapper, C/C++ code, or anything else, as long as it can run MuJoCo simulations and use sockets.

    At this stage, you will not be able to interact with the MuJoCo physics simulation from within the C# environment built into Unity (except to simulate the passive dynamics in the Unity Editor to help you with visual design).
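    The sending side of such a remote renderer could look roughly like the sketch below. The wire format here (a little-endian count followed by raw doubles) is purely an assumption for illustration; the actual protocol is defined by the released Unity plugin, not here:

```python
import socket
import struct

def encode_qpos(qpos):
    """Pack a pose as a little-endian uint32 count followed by raw doubles.
    NOTE: hypothetical framing; the real MuJoCo-Unity plugin defines its own."""
    return struct.pack("<I", len(qpos)) + struct.pack("<%dd" % len(qpos), *qpos)

def decode_qpos(payload):
    """Inverse of encode_qpos, as the renderer side would do."""
    (n,) = struct.unpack_from("<I", payload)
    return list(struct.unpack_from("<%dd" % n, payload, 4))

def send_pose(host, port, qpos):
    """Push one simulation step's qpos to the renderer over TCP."""
    with socket.create_connection((host, port)) as s:
        s.sendall(encode_qpos(qpos))

# Round-trip check without a live socket:
pose = [0.1, -0.5, 3.0]
print(decode_qpos(encode_qpos(pose)))
```

    Whatever steps the simulation (MuJoCo-Py, C/C++ code, etc.) would call something like `send_pose` once per rendered frame; since doubles round-trip exactly through this packing, the renderer sees the same qpos the simulator produced.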
     
  11. Emo Todorov

    Emo Todorov Administrator Staff Member

    The release is out, see new forum on Unity Integration.