Just finished working on an editor extension for Unity that splits a set of large sprite textures into small chunks, discards identical ones, bakes the rest into atlas textures, and then seamlessly reconstructs the original sprites at runtime for rendering.
This technique can significantly reduce build size when multiple textures share identical areas.
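The deduplication step at the core of the technique can be illustrated with a minimal sketch (plain Python, not the actual extension code; the chunk size and hashing choices here are assumptions): split each texture's raw pixel buffer into fixed-size chunks, hash each chunk, keep only the first occurrence, and record an index map so the original can be reassembled later.

```python
import hashlib

def dedupe_chunks(texture: bytes, chunk_size: int):
    """Split a raw pixel buffer into chunks, keep only unique ones,
    and return (unique_chunks, index_map) for later reconstruction."""
    unique = []     # chunks that would go into the atlas
    seen = {}       # chunk hash -> index in `unique`
    index_map = []  # per-chunk index, enough to rebuild the texture
    for off in range(0, len(texture), chunk_size):
        chunk = texture[off:off + chunk_size]
        key = hashlib.sha1(chunk).hexdigest()
        if key not in seen:
            seen[key] = len(unique)
            unique.append(chunk)
        index_map.append(seen[key])
    return unique, index_map

def rebuild(unique, index_map):
    """Reconstruct the original buffer from unique chunks + index map."""
    return b"".join(unique[i] for i in index_map)
```

For example, a buffer of four 4-byte chunks where two are identical stores only three unique chunks, and `rebuild` restores the original bytes exactly.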
While working on a broadcast solution that used Unity to render and control a virtual character, I faced the problem of capturing keyboard input events while the application is not in focus. It turned out this isn't doable with Unity's input system, so I assembled a C# wrapper over the Windows Raw Input API to hook directly into the native input events.
Assembled a plugin for the Unity engine to work with Google Drive. The plugin provides an API for listing, searching, creating, uploading, editing, copying, downloading and deleting files; it works with Unity 5.6 and higher and supports all major target platforms (including WebGL).
In Breached we have a lot of digital noise/glitch effects. While we mostly used the wonderful “Sci-Fi and Glitch Post-Process” package, I wanted to add a bit of uniqueness to the UI noise effect, so I assembled a material function for distorting UI texture UVs.
It uses a bunch of procedural noise generators, panners and a texture mask to apply a specific distortion pattern.
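The same idea can be approximated outside the material editor. Here is a hedged Python sketch (the actual asset is a UE material function built from noise, panner and mask nodes, so every name and constant below is illustrative): offset each UV by a time-panned pseudo-noise value, scaled by a mask so distortion only appears where the mask is bright.

```python
import math

def pseudo_noise(x: float, y: float) -> float:
    """Cheap hash-style noise in [0, 1); stands in for the
    procedural noise generators used by the material function."""
    n = math.sin(x * 12.9898 + y * 78.233) * 43758.5453
    return n - math.floor(n)

def distort_uv(u, v, time, mask, strength=0.05, pan=(0.3, 0.7)):
    """Offset a UV coordinate by panned noise, attenuated by a
    mask value in [0, 1] so only masked regions get distorted."""
    # Panner: scroll the noise lookup over time.
    nu = pseudo_noise(u + time * pan[0], v + time * pan[1])
    nv = pseudo_noise(v + time * pan[0], u + time * pan[1])
    # Remap noise from [0, 1] to [-1, 1], then scale and mask.
    du = (nu * 2.0 - 1.0) * strength * mask
    dv = (nv * 2.0 - 1.0) * strength * mask
    return u + du, v + dv
```

With a mask value of 0 the UV passes through unchanged, and the offset magnitude is always bounded by `strength`, which keeps the distortion stable on UI elements.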
The function turned out to be quite flexible, so I thought it could be useful to share. You can grab an archive with the function and texture mask here: DistortionEffect.zip
Just drop the unzipped folder into your UE project and the function should become available as a “DistortUV” node.
A while ago, while working on Breached, I was figuring out how to visualize a weird in-game creature called the “Keeper”: something like an electromagnetic anomaly that stalks around and hunts the player.
The project has since moved to another engine and the concept of those creatures has completely changed, but I kind of liked the effect and decided to share it.
You’ve probably heard about the blend modes available in image and video editing tools like Photoshop or After Effects. Blending is an important content-creation technique and has long been an integral part of these tools.
But what about video games?
Say you need Color Dodge blending for a particle system. Or your UI artist made beautiful assets in Photoshop, but some of them use the Soft Light blend mode. Or maybe you want to create a weird Lynch-esque effect with Divide blending and apply it to a 3D mesh.
In this article I will describe the mechanics behind popular blend modes and try to simulate their effect inside the Unity game engine.
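For a taste of the math involved, here are the per-channel formulas for the three modes mentioned above, sketched in plain Python with channel values normalized to [0, 1]. Note that the Soft Light variant below follows the W3C compositing spec; Photoshop's own curve is close but not identical, and in Unity the same formulas would live in a fragment shader rather than C#.

```python
import math

def color_dodge(base: float, blend: float) -> float:
    """Brightens the base; a blend value of 1 pushes straight to white."""
    if blend >= 1.0:
        return 1.0
    return min(1.0, base / (1.0 - blend))

def divide(base: float, blend: float) -> float:
    """Divides base by blend; identical pixels come out white."""
    if blend <= 0.0:
        return 1.0
    return min(1.0, base / blend)

def soft_light(base: float, blend: float) -> float:
    """Soft Light per the W3C compositing spec."""
    if blend <= 0.5:
        return base - (1.0 - 2.0 * blend) * base * (1.0 - base)
    if base <= 0.25:
        d = ((16.0 * base - 12.0) * base + 4.0) * base
    else:
        d = math.sqrt(base)
    return base + (2.0 * blend - 1.0) * (d - base)
```

A quick sanity check: Divide turns any pixel blended with itself white, and Soft Light with a neutral 0.5 blend leaves the base unchanged, which is exactly how these modes behave in Photoshop.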
In case you are not interested in the details and are just looking for a complete, ready-to-use blend mode solution for Unity, try this package on the Asset Store: http://u3d.as/b9w