Ahoi there!
 

I'm ProcStack, Trancor, and Kevin Edzenga,
  Whichever ya feel like calling me!

Trancor is my pre-rendered visual graphics & diy builds
ProcStack is my coding & real-time graphics creations ('Trancor' was already taken on github)
Kevin Edzenga is just ... Me.

Everything shown here is my personal work;
  From this site's code to its 3d modeling, texturing, shaders, even pxlNav itself!
Everything the light touches! ...sorta.
  I didn't make Three.js, certainly.

pxlNav

Explore -
If you are on a computer and want to run around -
   pxlNav Example : Field Environment
     ( Note : Link will load in this tab )

Hit P on your keyboard to pause the background. Hit Y to open the GLSL Shader Editor.



What's pxlNav?

The background of this page is running 'pxlNav',
A JavaScript package that extends Three.js to make it more interactive / game-like.
Originally made for a virtual event space named "Antibody Club" [defunct] during the 2020 Covid lockdowns.

pxlNav is a player controller + rendering set-up that brings more interactive functionality to Three.js.
It's basically a wrapper for Three.js, using Three.js's renderer and character rig + animation features to create something like a game engine... I guess.

You can use any 3d modeling software to make interactive environments called Rooms, making it a little easier to build games and interactive experiences.

Let's say, in Maya or Blender, you make a scene and add extra User Detail attributes to your objects to tell pxlNav how to interact with them.
That gives you the ability, right in your modeling software, to set up-
Teleporter colliders within a scene or between Rooms, item pick-ups, ground colliders, animated textures, and more.
For customized object coding in javascript, mark objects as 'isScripted' and you can easily access that specific object by name in code.
Then pxlNav loads your FBX and accompanying javascript file as a Room that can be portal'ed to if ya wanted.
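To give a rough idea, that accompanying Room script could look something like the sketch below. This is a loose illustration, not pxlNav's actual API; the class shape and hook names (onRoomLoaded, onStep) are made up, and only getObjectByName() is real Three.js.

```javascript
// Hypothetical sketch only -- the real pxlNav Room API and hook names may differ.
// getObjectByName() is standard Three.js; everything else here is illustrative.
export class CampfireRoom {
  // Assumed hook: called once this Room's FBX has finished loading
  onRoomLoaded( scene ){
    // Grab an object that was flagged 'isScripted' in Maya / Blender
    this.lantern = scene.getObjectByName( "Lantern_Main" );
  }

  // Assumed per-frame hook: flicker the lantern's emissive intensity
  onStep( time ){
    if( this.lantern ){
      this.lantern.material.emissiveIntensity = 1.0 + 0.25 * Math.sin( time * 7.0 );
    }
  }
}
```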


Origin

Antibody Club was a site where we hosted virtual events, album releases, DJ sets, and holiday parties, available in any web browser and on mobile.
Antibody Club [defunct] - showing some of the now-removed network features.
See Antib0dy.Club in action [YouTube]

Since then, society walked back into the sun and stretched their legs a bit,
And the need for a virtual event space has waned.

However, pxlNav was pretty cool, so I decided to make it easier to install for your own Three.js projects.
Aiming to turn your 3d modeling software into a level editor.

As it stands, pxlNav supports FBX files for your Rooms (the environment / scene / levels) which can be made in your 3D program of choice.
It supports rig + animation fbx files for your character animation, and a simple curve in your scene can act as your camera path ( add another if you want the camera to look somewhere as it's traveling the path ).
Plus animated textures (driven by a second texture), particle effects, FPS navigation, obstacles, items, portals, and more.
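Wiring that into a page ends up being pretty lightweight. Here's a loose sketch of the idea, not the package's documented setup; the import path, option names, and init() call are assumptions for illustration.

```javascript
// Loose illustration only -- the import path, option names, and init() call
// are assumptions, not pxlNav's documented API.
import { pxlNav } from "pxlnav";

const nav = new pxlNav({
  container : "pxlnav-canvas",                        // hypothetical: element id to render into
  startRoom : "FieldEnvironment",                     // hypothetical: which Room to boot first
  roomList  : [ "FieldEnvironment", "CampfireRoom" ]  // hypothetical: FBX + js Rooms to register
});

nav.init();  // hypothetical entry point; kicks off loading and the render loop
```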

3d for the web has come a long way, and it has fewer hurdles than getting your game ready for cross-platform use.
Websites are just there, anywhere, and can be turned into apps easily these days.
So why not?


--
During the run of Antibody Club (2020), I did have a helping hand on a couple shaders, camera animations, and some other odds'n'ends.
Much of which has been altered since, but thanks nonetheless!
So, I'd be remiss if I didn't mention Michael Lee & Charles Wang for helping on Antibody Club / pxlNav.


Repos to Check Out-

procstack.github.io
Eyyyyy, check out pxlNav! I guess this repo is more of an example of using pxlNav before I get it tightened up for public use.

procPromo Minecraft Shader Pack
A shader pack for Minecraft, used through Optifine.
Azure Hamlet in the Sky
For the pack, I developed a depth-based smart texture blurring, in an attempt to capture the Minecraft key/promo art style used when new updates are released.
The alien fungal bloom has spread!
I also wrote a shadow distortion & bias system;
  A per-axis shadow distortion, with biasing based on distance from the camera/player,
  Plus projecting block sides out to sample the shadow at a distance, letting shadows cascade down the sides of blocks too.
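Just to sketch the bias half of that idea (in JavaScript rather than the pack's actual GLSL, and with made-up constants): the shadow-map lookup gets pushed further off the surface the farther a fragment is from the camera, so distant blocks don't self-shadow while nearby ones stay crisp.

```javascript
// Illustration of the distance-based bias idea only -- not the pack's GLSL,
// and the constants are arbitrary placeholders.
function shadowBias( distFromCamera, minBias = 0.0005, maxBias = 0.01, maxDist = 128.0 ){
  const t = Math.min( distFromCamera / maxDist, 1.0 );  // 0 near the player, 1 far away
  return minBias + ( maxBias - minBias ) * t;           // lerp between near & far bias
}
```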
A chill day in Palm Springs

pxlVisualizer
Trippy visuals, using OpenGL and C++ through openFrameworks.
Originally written in Python with PyGame; I decided it would be a fun project to learn C++ on.

pxlTextGenerator
Text to handwriting generator
Created to capture the personality of one of my characters, Diece; the very one who lives in the cabin of Metal-Asylum.net.
Letting them write the very words written in the tome perched upon the desk.
A segmenter, labeler, and scripting language were written to allow saving individual characters, variants, and written pages, with text effects like opacity, scale, and kerning.

Scan some writing, click the letters, adjust the spacing, type your page with those letters, and hit save!

Personal Projects / Socials / Links -
My main site - an earlier version of pxlNav,
But this time you can walk around and jump - Best on desktop

It changes every couple'a years, never quite getting to a state of being “done,” yet always a shotgun blast of different dev work.

Currently a real-time first-person Cabin in the woods, well, a forest with less trees than desired...
Since it's pxlNav, and like this site, if you hit Y on your keyboard, the shader editor will pop up.

A full drawing/painting app on desktop, a fun fidget toy on mobile.
On'n'off multi-year project (2014-2019) to make a full-fledged drawing site with layer support, brush types, vector tools, and brush effects (CPU-based pixel effects; this was pre-learning about OpenGL shaders or WebGL).
TV was king! vv Watch the timelapse of me making 'TV Kid' in Pxlmancer vv
Use the mixing palette to blend colors together to paint with, then save the palette by pressing a number key.
All of which can be saved to a .pxlm file to store your layers, settings, and palette swatches!

Sadly, an update in javascript broke saving images, and I haven't had the time to fix it yet...
You'll need to drag the image from the "Save Image" screen to a new tab to view and save it.
But you can easily save your document to a pxlm file, and open it back up to continue working on your project!


A little tid-bit, pxlmancer.com is the origin of my 'pxl' namespace for my projects.
It's my indicator of a codebase with a visual output; such as pxlNav, pxlVisualizer, pxlTextGenerator, etc.

More about pxlmancer.com on Imgur!

Cat tax, of course!
*More fun on phone!*
     A few-day project.
   Just a page where you can play around with particles. A random project to make a custom emitter and particle class structure in javascript.

*Use on phone!!*
     A few-day project.
   A custom photo filter site with interactive color and edge-effect OpenGL shaders, using Three.js for GPU access and your phone's multiple cameras in-browser.
   Tap the triple down arrow to change the filter. Tap and drag left & right or up & down to change the current filter's hue & saturation or edge detection size & brightness.

Socials-
The One'Offs-
  Shadertoy
Most of the shaders are utilitarian, but click dat cat!
'Utility' like an exploration into the Kuwahara Filter.
Or a Ray Marching shader to learn about SDF math.
Dwitter  
JavaScript code golfing
When all you get is 140 characters,
Better make each one count!


Blog -


About Me -

I think I'm a 'Technical Artist' (when doing real-time work) or a 'Technical Director' (on films) or a 'Creative Technologist' (for immersive)...
So I says, blue M&M, red M&M, they all wind up the same color in the end. - Homer ... Simpson

Ya know... I don't really know what I am.
I just know I like figuring out puzzles, and for the life of me, can't seem to stop my fidgety fingers.

Always gotta be tapping away at some code, or building some diy contraption, or 3d modeling, or sewing, or writing, or... well, ya get the idea.

An undiagnosed something-or-another, which spawned-in with the energy only befitting a gift from the mythical Red Bull itself!

vv Check out my Technical Art reel vv

In the past, I worked on 10 films, 9 of those at Blue Sky Studios.
Doing Character Simulation (for hair and clothing sims) for Epic, Rio 1 & Rio 2.
Effects (volume sims, particles, and some RBDs) on Ice Age 4.
Along with 5 years of Crowds development/navigation/sims in Houdini for Ferdinand, Rio 2, Peanuts, Ice Age 5, & Spies In Disguise.
I was part of 2 published SIGGRAPH papers and had the opportunity to speak at SIGGRAPH 2015 to a bunch'a peoples about camera-based crowd navigation for Peanuts.

After that I was the tech lead + fullstack dev on our small team for Antib0dy.Club / pxlNav in 2020.
Then a couple non-pxlNav virtual events, including a St. Jude + GCX virtual fundraiser, where I worked on WebSocket networking + chat room management between Unreal Engine and our server.
Intermixed with some xr/immersive reality work; like a couple official Instagram filters and 3 Home Environments for Meta's Oculus headset, among a few other things.


Recently I made a tool to turn any 3d model into a fabric pattern in Houdini, so I've been making custom plushies in my free time!
Daryll getting filleted! Daryll all done!
Frank The Fish chillin out on his Flamingotube

Outside of that, I'm on'n'off working on Graph Attention Network artificial intelligence.
I've been working on a general-purpose neuron that adjusts its own connections during prediction.
   I call it a "model-less" AI network, even though the model is just dynamically generated based on the input data.
It's the structure which derives regions of neural activation based on stimuli, like the Butterfly Effect echoing through nature.
   Forming a result (prediction) after subsequent activations, as though compounding ripples in a pond.

Rather than a grid of numbers aligning to yield a prediction, it's a data structure which outputs a value due to the neuron connections.
   Realistically, the output should be similar to a Recurrent Neural Network (RNN), but with a different mental structure.
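As a toy illustration of that flavor (not the actual network; the update rule here is just a Hebbian-style nudge I'm using as a stand-in): a node whose connection weights adjust themselves toward co-active neighbors while a prediction is being formed, instead of sitting in a fixed, pre-trained weight matrix.

```javascript
// Toy stand-in only -- not the network described above. Weights shift during the
// forward pass toward neighbors that are active at the same time (Hebbian-style),
// so the "model" is whatever structure the incoming data has carved out so far.
class DynamicNode {
  constructor(){
    this.links = new Map();  // neighbor id -> connection weight
  }

  activate( neighborActivations, rate = 0.05 ){
    let sum = 0.0;
    for( const [ id, weight ] of this.links ){
      const act = neighborActivations.get( id ) ?? 0.0;
      sum += weight * act;
      // Strengthen links to co-active neighbors, gently decay the quiet ones
      const delta = act > 0.5 ? rate : -rate * 0.5;
      this.links.set( id, Math.max( 0.0, weight + delta ) );
    }
    return Math.tanh( sum );
  }
}
```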

...Mostly, Graph Attention Networks are used for "Recommendation Systems",
Hey, you might know Jim Bob McGee!!
But could be used for so much more!

So, all this new AI stuff has been quite serendipitous for me!

How about an ESN AI I wrote in the summer of 2024?
An ESN or Echo State Network is a type of RNN which considers time in its prediction.
It thinks about past events to predict future events.
Since the brain learns on the fly, why not feed it some videos I made?
Upper left are some videos I made, upper right are its 'levels' of learning R-G-B,
Where red are known patterns, green are the edges of the patterns, and blue are the "less likely" patterns.
Then on the bottom are two slices of the patterns the brain thinks it's seeing and then predicting.
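For reference, the textbook leaky-reservoir update looks roughly like this. A minimal sketch in JavaScript, not the exact Python used for those videos: a fixed random reservoir gets driven by the input, and only a linear readout of the state gets trained.

```javascript
// Minimal textbook-style ESN reservoir update -- not the exact code behind the videos.
// Win and W are fixed random matrices; only the (not shown) linear readout is trained.
function esnStep( state, input, Win, W, leak = 0.3 ){
  const next = new Array( state.length ).fill( 0.0 );
  for( let i = 0; i < state.length; i++ ){
    let sum = 0.0;
    for( let j = 0; j < input.length; j++ ) sum += Win[i][j] * input[j];  // input drive
    for( let j = 0; j < state.length; j++ ) sum += W[i][j] * state[j];    // reservoir echo
    next[i] = ( 1.0 - leak ) * state[i] + leak * Math.tanh( sum );        // leaky update
  }
  return next;  // the prediction is a trained linear readout of this state
}
```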


Different types of slices from the same ESN, just different input video.
Upper left is another video I made that the AI is watching, upper right is shifts in detected movement,
Lower left is a slice of the brain's learned wrinkles, lower right is predicted movement.
Currently it doesn't use the predicted movement for anything,
The next step would be introducing a base image to motion-transfer / reference.
So I'm just learnin' while watching my ai learnin'!

With a "reservoir" of 15 time steps, you'll notice the brain shifts about every 15 frames.
By frame ~45, it's learned some patterns.
The brain seems to completely melt at ~75 and rebuild itself by ~95.

It may not just be happenstance that the brain shifts when the reservoir fills;
It could mean I'm not correctly accounting for high p-values or outliers;
But it's detecting patterns in motion!
Custom ESN Learning Gradients - A slice of the ESN's brain by frame 101 of watching the X pattern video.
Since I didn't have a good use case for the ESN in Python,
   I built a similar ESN in C# in Unity to give NPCs the ability to learn player habits.
The logic is pretty simple, so running on the CPU is fine for now,
   I'll likely look to move it to the GPU in the future, if need be.
I set it up to learn less often when the player is in another room.
   Thinking that I'd want to set up a "data transfer" between characters,
     Gossip about the player.
   But it's just cubes and spheres in Unity at the moment.

If you couldn't tell by now, I'm training my AIs on my own creations.
A personally made AI trained on personally made images / videos / photos / code / writing as training data.
   That means I can copyright my generations, right?
     If I made every aspect of the AI and training data?