I started my dive into AI in 2008 writing a Boid / Crowd system for my thesis while in art college, School of Visual Arts.
It was an insane particle script plus 3D animation cycles in Maya, haha.
Then I did Boid movement, navigation, & obstacle detection in animated films for 5 years at Blue Sky Studios, using Houdini.
I dove into Style-Transfer AI & Long Short-Term Memory (LSTM) training in 2019-2020,
Like making a Node.js server (website) understand my voice & auto-Google-search for me.
Since then, I've been developing different multi-media AI structures in my spare time.
In 2015 I decided I'd cram a machine learning AI into a single-board computer, a Jetson TK1, by the end of 2026.
Something that could write down what I say,
Use vision to understand that an object simply went out of frame,
Yet "knows" that if it looks over, the object is still there: 'Attention'.
At the end of 2023, this evolved into a deep learning AI crammed into, likely, a Jetson Nano.
As something to infer what I mean, from what I say,
Or give a "thought" on what it saw or heard in the world around it.
'Machine Learning' is AI that can learn basic patterns.
'Deep Learning' is Machine Learning
That uses neural networks to form patterns of patterns.
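A minimal sketch of what I mean by "patterns of patterns", with hand-set weights (nothing here is trained; all the numbers are made up for illustration): a single threshold unit can represent a simple pattern like OR or AND, but XOR only falls out when a second layer combines those first-layer patterns.

```python
import numpy as np

def step(x):
    # Threshold activation: fires (1.0) when the input is positive
    return (x > 0).astype(float)

def unit(x, w, b):
    # One "pattern": a single linear threshold unit
    return step(x @ w + b)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)

# Layer 1: two simple patterns, OR and AND
h_or = unit(X, np.array([1.0, 1.0]), -0.5)   # x1 OR x2
h_and = unit(X, np.array([1.0, 1.0]), -1.5)  # x1 AND x2

# Layer 2: a pattern of those patterns: XOR = OR AND (NOT AND)
hidden = np.stack([h_or, h_and], axis=1)
xor = unit(hidden, np.array([1.0, -1.0]), -0.5)

print(xor)  # [0. 1. 1. 0.]
```

No single threshold unit can produce that last output on its own, which is the whole point of stacking layers.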
Realistically, I'd just be happy to make something that can understand what I say and give a semi-coherent response without an internet connection.
As of May 24th 2025, I've started on the core of the AI,
But I'm still testing different structures' abilities to adapt to stimuli.
... It really seems like any network could work for most things, but some are better than others per task.
As you could guess,
All the recent AI hullabaloo (2019-...)
Has been quite serendipitous for my creation!
And no, I'm not using AI to speak for me here.
These are my thoughts, however scattered they may be.
Dreamy Meanderings
2025-08-30
3 - 5 min read
I've been looking into dream research again.
For a while I've been planning on a meditative dream state for tensor field testing.
Testing different stimuli on the current network's state to produce outputs I can test and compare with known 'Real' data, like in a GAN.
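A toy sketch of that GAN-style check, with hypothetical stand-in data (the `real` and `dreamed` arrays, and every number here, are made up for illustration): train a tiny logistic-regression discriminator to tell real samples from the network's "dreamed" outputs. If the discriminator scores well above chance, the dreamed outputs clearly don't match the real data yet; driving its accuracy back toward 50% would be the training signal.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins: "real" recorded data vs. outputs the network
# "dreams" up under synthetic stimuli (both just 1-D Gaussians here)
real = rng.normal(0.0, 1.0, size=200)
dreamed = rng.normal(3.0, 1.0, size=200)

x = np.concatenate([real, dreamed])
y = np.concatenate([np.ones(200), np.zeros(200)])  # 1 = real, 0 = dreamed

# Tiny logistic-regression discriminator (the GAN-style critic)
w, b = 0.0, 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(w * x + b)))  # sigmoid prediction
    w -= 0.5 * np.mean((p - y) * x)         # gradient descent on log-loss
    b -= 0.5 * np.mean(p - y)

pred = (1.0 / (1.0 + np.exp(-(w * x + b)))) > 0.5
acc = np.mean(pred == y)
print(f"discriminator accuracy: {acc:.2f}")
```

Here the two distributions barely overlap, so the critic ends up far above chance; a "dream state" producing convincing outputs would pull that number down toward 0.5.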
What interests me about dreams this time is the dream-building process, which scientists seem to somewhat agree on.
Like the foundation of a dream, which gathers ideas like fly-paper catching bugs.
( Activation-Synthesis Theory && Threat Simulation Theory )
I've always been fascinated by dreams since I was young.
Like a little movie story generator in my brain, with supernatural abilities or random scale changes
(Honey I Shrunk The Kids was quite popular back when)
Until I was maybe 20, I just assumed movie dream sequences were a stylistic choice, so as not to bore people with their black'n'white or grayscale imagery.
That was until I asked someone if their dreams looked like movie dream sequences. They told me yeah, but that their dream colors are more vivid than the movies'.
Wait.... Color in dreams??
The further I asked individuals, the more I found out that dreaming of flying through nebulas while fixing a broken panel on a space ship wasn't too common.
Or running up some Kaiju monster's arm to roundhouse kick 'em in the face would really only happen after they watched Attack On Titan.
...The shadow people though... I could do without the shadow people. Freaky ass, ferrofluid moving, 'blank' people...
Of course I started looking into what research existed for people dreaming in black and white or grayscale.
They say some 7-11% dream in gray, but mostly older people; so researchers attributed it to growing up with black'n'white TV. (Schredl, 2008)
I'm not even 40 yet. If my childhood had impacted my dream colors, they'd be black'n'green like some DOS computers,
Or a VGA graphics card's 256 on-screen colors, looking like Commander Keen or Duke Nukem 1/2.
Clearly there are structural differences in the brain causing these changes in types of dreaming.
I feel I should note this, as it's likely important context for my personal qualia.
I don't really visualize stuff in full color in my brain. I can think of a red apple, but it's only 10% colored in, accompanied by the 'feeling' of very specific shades of red ( and yellow, if thinking of a Jazz apple ).
Mentally 'felt' colors I could easily pick out in Photoshop's color picker;
Just not fully seen in my mind's eye.
If I focus harder, maybe I can fill in the mental image of that apple from 10% up to 25% colorized, but it's just a mental-visual representation of the exact color my brain was already 'feeling'.
But my dreams don't have these color-feelings to them.
Perhaps that info is lost in my memory, and I do have color associations for objects in dreams; they just don't record to my brain's meat-memory.
I'd assume, given how many people report not having an inner eye, there would be a lot more reports of black'n'white dreams if the two were correlated.
I want to do more research into potential links between types of personal qualia, but that's a topic for another post.
It seems accepted that dreams help keep areas of the brain active (sustained neural activity) while performing neural pruning.
Activating areas in the visual cortex, theme/story concepts, and fear/debate responses is a way to keep those good connections active while the rest of the brain does a nightly sweep to clean up plaque and secure neural pathways.
So instead of your brain deciding it should change the connections in your visual / auditory areas of the brain, those connections stay active to reduce neural plasticity in those areas.
( Defensive Activation Theory - Time Article )
... please tell me people at least hear stuff in dreams too, haha.
But until more research is done in this field, I'll be over here dreaming up tons of people running around in dark grey environments, while I'm rebuilding some Tolkien-esque, Giger-styled gear systems,
Or riding alongside tiny ant-riding warriors,
While my brain is cleaning itself and shoring up axonal connections.
research, dreams, qualia