Depiction Engine For Unity 2023.0 (Alpha)

Officially Supported versions

  • Unity (Windows, Android, WebGL)

    • 2022.x
    • 2023.x

Installation

  • Creating a Project

    • Make sure you have .NET 7 or higher installed.

    • Import Universal RP from the Package Manager under "Window -> Package Manager".
      Create a Universal Render Pipeline Asset and assign it under "Edit -> Project Settings -> Quality -> Render Pipeline Asset".
      Note
      Only Universal Render Pipeline (URP) projects are officially supported for the moment.

    • Under "Edit -> Project Settings -> Player -> Other Settings" set the following values:
      Color Space* = Linear
      Api Compatibility Level* = .NET Framework
      Allow unsafe Code = true

    • Import TextMeshPro from the Package Manager under "Window -> Package Manager".

    • Import Depiction Engine from GitHub.
      Note
      A series of DepictionEngine.ManagerBase's are required in the Scene and will be created automatically the moment a DepictionEngine.Object is first introduced to the Scene. Managers can also be created manually by right-clicking the Hierarchy Window and selecting "Managers" in the Context Menu.
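
      The Player settings above can also be applied from an editor script. The following is only a convenience sketch using Unity's standard PlayerSettings API; the menu path and class name are illustrative, and mapping ".NET Framework" to ApiCompatibilityLevel.NET_Unity_4_8 is an assumption based on recent Unity versions.

      using UnityEngine;
      using UnityEditor;

      public static class DepictionEngineProjectSetup
      {
          [MenuItem("Tools/Apply Depiction Engine Player Settings")]
          public static void ApplyPlayerSettings()
          {
              // Color Space = Linear
              PlayerSettings.colorSpace = ColorSpace.Linear;

              // API Compatibility Level = .NET Framework (assumed to map to NET_Unity_4_8)
              PlayerSettings.SetApiCompatibilityLevel(BuildTargetGroup.Standalone, ApiCompatibilityLevel.NET_Unity_4_8);

              // Allow 'unsafe' Code = true
              PlayerSettings.allowUnsafeCode = true;
          }
      }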
  • Patching Universal Rendering Pipeline

    • Some important Shader Graph nodes require the rendering pipeline to be modified to function properly. The patch should be applied automatically when the DepictionEngine.RenderingManager is created; however, you can also trigger it manually by pressing the "Patch Universal Rendering Pipeline" button on the DepictionEngine.RenderingManager in your Scene.
      Warning
      You might have to restart Unity after the Universal Rendering Pipeline has been patched.
  • Javascript API

    • To build the Javascript API, you will have to move the WebGLTemplates folder to "/Assets/". You should now be able to select the "Depiction Engine - Demo" WebGL Template under "Edit -> Project Settings -> Player -> Resolution and Presentation".

    • Before you build for WebGL, make sure all the managers are present in your Scene; if they are missing, you can create them by right-clicking the Hierarchy Window and selecting "Managers" in the Context Menu. To handle the communication with Javascript, you will also need to add a DepictionEngine.JsonInterface component to the "Managers" GameObject (see the sketch after this list). You will also have to save your Scene and make sure it is included in your build under "File -> Build Settings -> Scenes in Build".

    • If you want to control Post processing effects from Javascript make sure to read the 'Scripting' point of the Post Processing section.
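
      As a rough sketch of the JsonInterface step above, the helper below ensures the "Managers" GameObject carries a DepictionEngine.JsonInterface component before a WebGL build. GameObject.Find and AddComponent are standard Unity calls; the helper class and method names are hypothetical.

      using UnityEngine;
      using DepictionEngine;

      public static class WebGLBuildHelper
      {
          public static void EnsureJsonInterface()
          {
              // "Managers" is the GameObject created through the Hierarchy Window Context Menu.
              GameObject managers = GameObject.Find("Managers");
              if (managers != null && managers.GetComponent<JsonInterface>() == null)
                  managers.AddComponent<JsonInterface>();
          }
      }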

Overview

Editor integration

  • New GameObject Menu items

    • Align View to Selected GeoAstroObject:
      When enabled, the current Scene Camera will always be perpendicular to the selected GeoAstroObject surface.

    • Auto Snap View to Terrain:
      When enabled, the current Scene Camera will always have its target snapped to the terrain.

    • Move View to GeoCoordinate:
      When triggered, a popup will be displayed allowing you to enter a GeoCoordinate you want the current Scene Camera target to quickly navigate to.
  • New Hierarchy Window Context Menu items

    • When right-clicking the Hierarchy Window, the Context Menu will now contain a new item called "Depiction Engine" where all kinds of new objects and components can be created. Look for the "Depiction Engine -> Astro" items for different planet and map presets to help get you started.

Instancing / Disposing

Double precision

  • Origin Shifting

    • Position objects at far greater distances from the world's origin with double precision (64-bit) transforms and origin shifting, allowing you to create much larger projects than is normally possible in Unity. Origin shifting is controlled by DepictionEngine.RenderingManager.originShifting and is enabled by default.

    • For performance reasons, empty DepictionEngine.Object's are origin shifted only while they are selected in the Editor, so that the manipulation tools (Move, Rotate, Scale...) are displayed correctly, whereas DepictionEngine.VisualObject's are always origin shifted since they are expected to have rendered visuals as children. DepictionEngine.Object's that do not require positioning will always be positioned at zero (origin) with no rotation (identity). If required, this behaviour can be changed by any class that extends DepictionEngine.Object and overrides its DepictionEngine.Object.RequiresPositioning method to return true (see the sketch below).

    • Multiple Scene camera layouts in the Editor are possible, although at a significant performance cost, since the GameObjects have to be moved for every camera render.
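
      As referenced above, here is a minimal sketch of opting an otherwise empty Object into positioning. It assumes DepictionEngine.Object.RequiresPositioning is a virtual, parameterless method returning bool; the subclass name and accessibility are illustrative.

      using DepictionEngine;

      public class AlwaysPositionedObject : DepictionEngine.Object
      {
          // Returning true means this Object is positioned and origin shifted
          // even while it is not selected in the Editor.
          protected override bool RequiresPositioning()
          {
              return true;
          }
      }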

Datasource

  • Saving

    • Only properties that have been marked as 'Out of Synch' will be pushed to the DepictionEngine.Datasource when a save operation is performed. Properties modified through the Editor should be automatically marked as 'Out of Synch'; however, if need be, properties can also be marked manually using the DepictionEngine.SceneManager.StartUserContext / DepictionEngine.SceneManager.EndUserContext methods. For properties to be persisted, they need to have the DepictionEngine.JsonAttribute (see the sketch below).
      object.IsUserChange(() => {
          // Perform property assignment.
      });
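
      A minimal sketch of a persisted property, assuming DepictionEngine.JsonAttribute can be applied as [Json]; the class and property names are hypothetical:

      using DepictionEngine;

      public class MyPersistedObject : DepictionEngine.Object
      {
          // Only members carrying the Json attribute are pushed to the Datasource on save.
          [Json]
          public string Label { get; set; }
      }

      An assignment to Label performed inside the IsUserChange callback shown above would be flagged as 'Out of Synch' and persisted on the next save.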

Shader

  • Shader Graph integration

    • All built-in shaders can be edited using Shader Graph.

Post Processing

Procedural generation

  • Code

    • The type of procedural object to spawn, specified by the DepictionEngine.FallbackValues, will need to implement the following method.
      private static PropertyModifier GetProceduralPropertyModifier(PropertyModifierParameters parameters)
      Here is an example of a DepictionEngine.Texture object populating a DepictionEngine.TextureModifier:
      private static PropertyModifier GetProceduralPropertyModifier(PropertyModifierParameters parameters)
      {
          TextureModifier textureModifier = ProcessingFunctions.CreatePropertyModifier<TextureModifier>();

          int textureSize = 256;
          textureModifier.Init(PopulateProceduralPixels(parameters, textureSize, textureSize, GetPixel), true, textureSize, textureSize, TextureFormat.RGBA32, false);

          return textureModifier;
      }

      protected delegate void GetPixelDelegate(PropertyModifierParameters parameters, float x, float y, out byte r, out byte g, out byte b, out byte a);

      protected static byte[] PopulateProceduralPixels(PropertyModifierParameters parameters, int width, int height, GetPixelDelegate pixelCallback)
      {
          byte[] pixels = new byte[width * height * 4];

          if (pixelCallback != null)
          {
              for (int y = 0; y < height; y++)
              {
                  for (int x = 0; x < width; x++)
                  {
                      pixelCallback(parameters, (x + 0.5f) / width, (y + 0.5f) / height, out byte r, out byte g, out byte b, out byte a);

                      int startIndex = (y * width + x) * 4;
                      pixels[startIndex] = r;
                      pixels[startIndex + 1] = g;
                      pixels[startIndex + 2] = b;
                      pixels[startIndex + 3] = a;
                  }
              }
          }

          return pixels;
      }

      protected static void GetPixel(PropertyModifierParameters parameters, float x, float y, out byte r, out byte g, out byte b, out byte a)
      {
          // Add the procedural algorithm here.
          // The seed can be found in parameters.seed.
          r = (byte)(x * 255);
          g = (byte)(y * 255);
          b = 0;
          a = 255;
      }