WebGL & canvas workshop DevState April 4, 2014 at 12:51 pm

Excited! At the upcoming Beyond Tellerrand conference I'll be giving a WebGL / HTML5 canvas workshop together with my DevState buddy Sakri Rosenstrom. Sakri will train and cover the HTML5 canvas part - not least because he's really been digging into making nice little (text) effects and snippets for canvas lately: you can see a bit of his collection via codepen.io - or his latest tribute to 80's oldschool series, this 'He-Man'ish' text effect.

I´ll be training the WebGL part. This will cover:

  • Basics of setting up and working with WebGL
  • From your first flat triangle to your first textured cube
  • Working with vertex/fragment shaders
  • Advanced techniques and beyond…

…and some more artistic shaders and effects (like the cross-hatched 'DevState'd' shader below), raymarching and maybe some sound-related demo stuff (like this realtime sound-reactive little thingy).
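To give a taste of the triangle-to-textured-cube step: most of the work is preparing the buffer data before a single shader runs. A rough sketch in plain Javascript (illustrative layout and function name, not the actual workshop material):

```javascript
// Builds positions, UVs and indices for a textured cube: 24 vertices
// (4 per face, so every face can get its own texture coordinates)
// and 36 indices (2 triangles x 6 faces). Sketch only - feed the arrays
// into your own WebGL buffers.
function buildTexturedCube(size = 1) {
  const h = size / 2;
  const positions = [], uvs = [], indices = [];
  // each face: an origin corner plus two edge vectors spanning the quad
  const faces = [
    { o: [-h, -h,  h], u: [ size, 0, 0], v: [0, size, 0] }, // front
    { o: [ h, -h, -h], u: [-size, 0, 0], v: [0, size, 0] }, // back
    { o: [ h, -h,  h], u: [0, 0, -size], v: [0, size, 0] }, // right
    { o: [-h, -h, -h], u: [0, 0,  size], v: [0, size, 0] }, // left
    { o: [-h,  h,  h], u: [ size, 0, 0], v: [0, 0, -size] }, // top
    { o: [-h, -h, -h], u: [ size, 0, 0], v: [0, 0,  size] }, // bottom
  ];
  faces.forEach((f, fi) => {
    const base = fi * 4;
    for (const [s, t] of [[0, 0], [1, 0], [1, 1], [0, 1]]) {
      positions.push(f.o[0] + f.u[0] * s + f.v[0] * t,
                     f.o[1] + f.u[1] * s + f.v[1] * t,
                     f.o[2] + f.u[2] * s + f.v[2] * t);
      uvs.push(s, t); // per-face texture coordinates
    }
    indices.push(base, base + 1, base + 2, base, base + 2, base + 3);
  });
  return { positions, uvs, indices };
}
```

From there it's only `gl.bufferData` calls and a trivial textured shader away from the screen.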

So I wrote a little shader - as a little appetizer and to warm up for this workshop. How about a nice little shader, plus an artistic gfx effect that makes it look sketched and drafted, plus freely rotatable text (via mouse down & move), plus all the source code you need to experiment yourself? Here you go:

Sketchy DevState realtime toy
Launch
(needs a WebGL-enabled browser!)
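If you're wondering how the 'sketchy' look works: the cross-hatch effect boils down to layering more hatch strokes the darker a pixel is. A tiny sketch of that threshold logic in plain Javascript (the cutoff values are my own illustration, not the shader's actual numbers):

```javascript
// Cross-hatching in a nutshell: the darker a pixel, the more overlapping
// hatch layers it receives. Illustrative thresholds, not the real shader's.
function hatchLayers(luminance) {
  const thresholds = [0.8, 0.6, 0.4, 0.2]; // one hatch direction per threshold
  let layers = 0;
  for (const t of thresholds) {
    if (luminance < t) layers++; // darker pixels pass more thresholds
  }
  return layers; // 0 (bright, untouched) .. 4 (dark, fully hatched)
}
```

In the fragment shader the very same comparisons run per pixel, with each layer drawing stroke lines in a different direction.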

Btw, needless to say that once again being part of such a great conference as the upcoming 'Beyond Tellerrand' confy, seeing that fantastic line-up of speakers, is a major honour. And 'hey!', there are still tickets available for our workshop.

Any questions? Need some more info? I've set up a little Pagie for that:

DevState´s bt-confy workshop
Launch

It only remains for me to add - get your WebGL / HTML5 canvas workshop ticket soon & see you!

Distance estimated 3D fractals. March 25, 2014 at 1:25 pm

I've spent nearly all of my time over the last months developing 'Pagie' for the cool clique at Psykosoft. And it was and still is so much fun, because we have so many features still to come, new ideas and - 'Pagie' being in private beta now - problems to solve too. Meanwhile, scripting my own ideas, snippets or experiments, a.k.a. just playing with gfx and code, came off badly, so last night I took some time to write at least one little shader experiment.

Time to play with WebGL and shaders again! The result: some distance estimated 3D fractals - and, to show them off, a little showcase Pagie…

Distance Estimated 3D fractals via Pagie
Launch
(needs a WebGL-enabled browser!)

I know that's not rocket science any more; over the past four years the 3D fractal field has seen massive impact and growth due to Mandelbulbs, more interesting hybrid systems like Spudsville, and diverse Kleinian systems. But still, with WebGL becoming more and more widespread in browsers and even on mobile devices these days, it's really satisfying to write shaders again that can be accessed online (Shadertoy ftw!).

Distance Estimated 3D fractals
Launch
(needs a WebGL-enabled browser!)
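For the curious, the heart of such a renderer is the distance estimator. Here's the widely used power-8 Mandelbulb estimate, transcribed to plain Javascript for readability (the shader itself runs this in GLSL; the function name and constants here are mine):

```javascript
// Classic Mandelbulb distance estimator: iterate z -> z^power + p in
// spherical coordinates, track the running derivative dr, and return
// the usual 0.5 * log(r) * r / dr distance bound (negative = inside).
function mandelbulbDE(px, py, pz, power = 8, iterations = 16, bailout = 2.0) {
  let zx = px, zy = py, zz = pz;
  let dr = 1.0, r = 0.0;
  for (let i = 0; i < iterations; i++) {
    r = Math.sqrt(zx * zx + zy * zy + zz * zz);
    if (r > bailout) break;
    // convert to spherical coordinates and raise to 'power'
    const theta = Math.acos(zz / r) * power;
    const phi = Math.atan2(zy, zx) * power;
    dr = Math.pow(r, power - 1) * power * dr + 1.0;
    const zr = Math.pow(r, power);
    zx = zr * Math.sin(theta) * Math.cos(phi) + px;
    zy = zr * Math.sin(theta) * Math.sin(phi) + py;
    zz = zr * Math.cos(theta) + pz;
  }
  return 0.5 * Math.log(r) * r / dr;
}
```

The raymarcher then simply steps along each ray by whatever distance this function reports - which is why it's called sphere tracing.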

And after all: it feels so good to play & experiment again, and to reactivate this blog from being a dusty dead end to being a bit more up to date. Stay tuned, there's more to come soon…

Pagie beta March 19, 2014 at 3:38 pm

This is madness… ehhhhr… NO! This is Pagie!

Finally, as promised during my session at this year's FITC Amsterdam, we are 'Private Beta' now with Pagie, one of my beloved most-fav long-term projects, which I'm developing as lead architect and frontend developer (Javascript/HTML5/CSS3) at Psykosoft.

It's been almost one year now since I joined the very talented and decent crew of slightly skilled creative coders at Psykosoft. The second reason for that was: being part of a team that contains, among others, Mario Klingemann and David Lenaerts is pure adrenaline. The first reason was and still is: they promised challenging tasks and I'm still enjoying solving 'em…

Pagie Beta
Visit website & become a beta-user.

But what the heck is Pagie? Well, to list just a few features at this state of release:

  • Simply drag'n'drop images, pdfs, music, whatever files you like from your desktop to Pagie
  • Type and edit text directly on Pagie's stage
  • Scale, rotate, transform and re-arrange your stuff the way you want
  • Copy'n'paste images directly from the web, e.g. from Google Images, to Pagie
  • Embed your favourite Vimeo or Youtube videos, or Google maps, or Shadertoys
  • Generate pixel-perfect responsive pages - to be shown on any screen / any device
  • …and many more to come…

To give you some first impressions, and as a proof of concept, I just quickly (~15 mins of work) composed an example 'CV' page for myself with Pagie:

Pagie Beta
Launch.

As I already said - this is an ongoing long-term project, so we are constantly polishing, refining, debugging and adding features day by day. Feel free to join and help us as a beta user: just drop us a line (by pressing the contact button at pagie.net) and get access to Pagie first!

Bring the noise - yes, I do websites too… March 14, 2013 at 3:41 pm

Lately, agencies and potential clients have often asked me whether I accept 'normal' projects as well - meaning whether I can concept, lay out, design or realize websites, microsites, games, mobile developments or social-media-related gadgets too - SURE THING!

Yes, I do websites (and evil Javascript) too! :)

Ok, looking at my work and the projects I've done in the past, the impression might arise that I only accept sound-related visualizations (ok ok - let's say dubstep-related…) - but that's simply not true. On the contrary, I'm always on the lookout for interesting job inquiries of all kinds, be it a concept or a design for a website/microsite/ad campaign or such, or 3D-related programming tasks, or social media gadgets, or mobile developments, or effects and filters for all kinds of applications, or simply the realization and programming of a given task or project.

To give you a varied selection of my work - please visit the new category of my blog called 'work' (no shit - it's about my work).

Nevertheless I have to point out that I just released videos of another project I've done in the past, and ehrrrrrrrr… 'yep', it's sound-related, and 'yep', it's dubstep again…

I was asked by Image-Line to design and develop a GLSL-based fragment shader as part of the new 'ZGameEditor Visualizer' release, to underline the enhanced features and usage of shader-based effects within the new program version.

Get more infos here.

visualization videos via Vimeo
More infos

Apart from that I developed two 'demoscene'-like sound-reactive realtime demos, both performed in one single fragment shader, showcasing the possibilities and power of using GLSL-based shaders for visualising sound.

1. Demo 'Shapeshifter' - View Video
2. Demo 'Blueberry Shake' - View Video

So don't hesitate to contact me about any projects, ideas, realizations, concepts, collaborations, commissions or exhibitions - or even just to say hello!
 

JJ… the doomed filter February 21, 2013 at 4:02 pm

A while ago I was asked to develop and design a set of filters (GLSL-based shaders) that could be used in realtime and projected live on a screen with Max/Jitter during the stage show of rapper JJ DOOM's 'Key to the Kuffs' album release. The filters should take the rapper on stage during the show as input and generate some nice-to-weird-looking 'comic'-like effects of him as output.

So much for the theory. Unfortunately the filters were dropped shortly before completion because of - ehrrrr… let's say the timing for testing the whole scenario was a bit tough - a.k.a. 'shit happens'.
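A 'comic' look like that usually starts with an edge-detection pass over the camera image. As a rough illustration of the idea - not the actual filter code - here is a Sobel operator over a grayscale buffer in plain Javascript:

```javascript
// Sobel edge detection on a grayscale buffer (values 0..1, row-major).
// In the real filter this runs per fragment on the GPU; this CPU sketch
// just shows the kernel. Border pixels are left at 0 for simplicity.
function sobel(gray, w, h) {
  const out = new Float32Array(w * h);
  for (let y = 1; y < h - 1; y++) {
    for (let x = 1; x < w - 1; x++) {
      const i = (dx, dy) => gray[(y + dy) * w + (x + dx)];
      // horizontal and vertical gradients
      const gx = -i(-1, -1) - 2 * i(-1, 0) - i(-1, 1)
               +  i( 1, -1) + 2 * i( 1, 0) + i( 1, 1);
      const gy = -i(-1, -1) - 2 * i(0, -1) - i(1, -1)
               +  i(-1,  1) + 2 * i(0,  1) + i(1,  1);
      out[y * w + x] = Math.hypot(gx, gy); // edge magnitude
    }
  }
  return out;
}
```

Threshold the magnitude and you already have the bold black outlines that sell the comic style.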

Now I decided to extract a part of the original filters and rewrite it, as a little finger exercise, with WebGL and Javascript to make it suitable for browsers. Doing this was fairly easy, because I could port my OpenGL GLSL nearly 1:1 to WebGL. Honestly, I had a lot more struggles with all the necessary Javascript around it, like writing scripts for:


- Webcam input - which isn't natively supported in every browser, so I used Flash for that
- Passing image data from Flash to Javascript
- Canvas-to-image conversions
- Enabling a 'save to disk' option in pure Javascript to save screenshots


But at the end, everything fell into place and was working as expected:
WebGL based image processing
Launch


K'… it's not working as expected if Internet Explorer is your weapon of choice - then you get no WebGL at all - but in case you use a modern, up-to-date browser, you can try out this little project here.

Enjoy.

Wortgebilde word creations December 18, 2012 at 5:55 pm

Achtung, Achtung! Beware: Tales from the Crypt ahead… or: I absolutely suck at blogging this year! In fact, there's so much to show, talk about, preview or tell from the actual projects, experiments and work I've done so far this year, but damn fact: me is simply missing the spare time to write about all that stuff.

So what's new? Recently I even managed to successfully maximize that lack of time by joining the FMX festival board as curator for the 'Realtime Graphics/Interactive/Flash/Demos/Whatever' slot. Meaning, I have the honour of inviting lots of stunning entities from the creative coding front (hint! already confirmed: some usual suspects as well as some promising heavyweights from the scene) - which also means I did a changeover in the case of FMX and will not speak myself but moderate there. Wanna know more? Stay tuned!

Beyond that I found out that finding time to experiment again <u>and</u> write about it is quite simple: just take some vacation and you can do personal work again. Easy! So whilst revisiting some old image processing works I'd done years ago, and trying to invite Jared Tarbell - one of my personal heroes when it comes to super-inspiring procedurally driven art and experiments - to FMX, I dug out one of Jared's old experiments, called the 'Emotion Fractal'. Still one of the works I like most…

I decided to quickly update the setup of Jared's original source code for the 'Emotion Fractal' and used it for some image processing on photos in a book that I prepared as a Christmas gift. With that said, I just released this little remnant for you to play with. Below is an example directly from the workbench:

Cola Claus
Launch image processor

How does it work:

  • You can upload your own images (2048 x 2048 pixels - the upload may take a while depending on the image size)
  • Modify the chain of words to work with (separate each new word with a blank)
  • Take snapshots of the work in progress and/or download the completed composition to your disk

You can test and play with the whole enchilada here.
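If you want to know what's going on under the hood: the 'Emotion Fractal' recursively fills space with ever-smaller words. A much-simplified sketch of that subdivision idea in Javascript (my own reading of the concept, not Jared's actual source):

```javascript
// Recursively fill a rectangle with words: place one word, then recurse
// into the remaining strips with smaller regions until a minimum size
// is reached. The layout rule (word occupies the top-left quarter) is a
// deliberate simplification for illustration.
function fillRegion(x, y, w, h, words, minSize, placed = []) {
  if (w < minSize || h < minSize) return placed; // region too small, stop
  const word = words[placed.length % words.length]; // cycle through the word chain
  placed.push({ word, x, y, size: Math.min(w, h) / 2 });
  // recurse into the two leftover strips (right half and bottom half)
  fillRegion(x + w / 2, y, w / 2, h, words, minSize, placed);
  fillRegion(x, y + h / 2, w, h / 2, words, minSize, placed);
  return placed;
}
```

Swap the placeholder layout rule for real text metrics and draw each placement to a canvas, and you're most of the way to the effect.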

Last but not least - Happy holidays!

Mimic scale9-grid on GPU. Shaders in their natural habitat #2 August 7, 2012 at 10:27 am

I know, I know - the frequency at which I've been keeping my blog up to date with new posts during the past few months is about as fast as a snail's pace…

But the lotsa interesting tasks and projects I've been working on over the past months simply don't leave the time it takes to write about them - but I pledge to improve! Well, again, the things I'm working on are mainly tasks dealing with writing shaders and programming GPU-accelerated stuff - all in all a huge playground for me.

And since Adobe Flash Player 11 has GPU acceleration too, there's the challenge of rethinking many built-in features of Flash here, in order to implement them on the GPU too. In the case of 2D GPU-driven graphics this means saying 'bye bye!' to your beloved comfort of having a display list and many legacy features like vector masks and scale-9, for instance; going GPU means rebuilding these from scratch again.

K', so a very specific, optimized 2D renderer is one of the tasks I've been working on over the last months; for that I needed to rebuild features like nested clips, a.k.a. nesting and compounds. Another requirement was to get Flash's well-known Scale9-Grid feature back, fit for use on the GPU. So 'Pop! goes the weasel' - wanna have your scale9-grid back on the GPU?

Introducing ‘Vertex9‘ - a GPU based vertex solution mimicking an accurate scale9-grid:

Vertex9 ...mimic scale9-grid on GPU
Launch

Btw., up next in September I'm speaking at the upcoming 'Reasons to be Creative' festival in Brighton. There I'll show all kinds of actual stuff I'm working on right now - so just in case you don't want to wait another half year for a new blog post here, maybe it's a pleasurable occasion to grab a ticket - see you there?

Shaders in their natural habitat. episode 1 April 3, 2012 at 10:37 pm

Feels like eons have passed since I wrote something here (shame on me for being such a lazy bum) - but finally…

First of all, lotsa things have changed during the last half year. Flash is dead… ehhhrrr… not so dead yet, HTML5 is becoming the superperformant 'jack of all trades' device (oh yeah!) and WebGL runs on IE (NOT). But seriously, I personally decided to give it a try and go freelance with my pet subject… a.k.a. taking client orders for projects that are in need of shaders (gimme gimme - frank [at] prinzipiell [dot] com).

Programming ‘just’ Shaders, wtf !!! (you´d almost think…)

But believe it or not, nowadays it's hardly possible to imagine 3D engines/games, programs or projects without the use of a single shader. A shader… what is it? To put it simply: shaders are small programs that describe the traits of either a vertex or a pixel (fragment). Shaders are used to program the GPU's programmable rendering pipeline and primarily calculate rendering effects on graphics hardware with a high degree of flexibility.

Fair enough, but let's specify a little scope of application: back in the good old days graphics were done in software (hello Flash 10). This might be flexible as well, but it is not really performant and not very efficient, because the fixed pipeline is rather limited. Something as simple as Phong shading already requires a shader (the shading capabilities of the fixed pipeline end with Gouraud shading). The possibilities that open up with shaders are huge; some of them are better shadowing techniques, ambient occlusion, normal mapping, etc. … in other words, and to put it with some exaggeration (!!!): modern 3D/2D engines/games would look a bit lame (not to say like crap) without shaders.
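Since Phong shading came up: the whole model is just ambient + diffuse + specular, small enough to write down. A textbook sketch in plain Javascript (in a real shader this runs per fragment in GLSL; all vectors are assumed normalized):

```javascript
// Phong lighting terms for one light. Returns the diffuse and specular
// contributions; multiply with light/material colors and add an ambient
// term to get the final pixel color.
function phong(normal, lightDir, viewDir, shininess) {
  const dot = (a, b) => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
  const diffuse = Math.max(dot(normal, lightDir), 0); // Lambert term
  // reflect the light direction about the normal: r = 2(n.l)n - l
  const r = normal.map((n, i) => 2 * dot(normal, lightDir) * n - lightDir[i]);
  const specular = Math.pow(Math.max(dot(r, viewDir), 0), shininess);
  return { diffuse, specular };
}
```

The per-fragment evaluation of exactly this formula is what the fixed pipeline couldn't do - it only interpolated per-vertex colors (Gouraud).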

In my case, when I have some spare time left, I like to program more 'experimental' stuff with shaders. Sound-reactive visualizations driven by WebGL, for example, like this little thingy here:

Artificial
Launch
(needs a WebGL-enabled browser)

Or image processing stuff, combining Flash and Pixel Bender (Adobe's 'almost on GPU' - but for Flash < 11 on CPU - shading solution):

big-ass Voronoi via Pixel Bender
Launch

Usually I don't care which language a shader must be programmed in (GLSL, HLSL, Pixel Bender shizzl or Assembler - sorry, I meant AGAL assembly source code for use with Flash 11), because it all depends on the target environment. When it comes to offline tasks, there are two top dogs to go for: the official OpenGL and OpenGL ES shading language is GLSL (OpenGL Shading Language), and the official Direct3D shading language is HLSL (High Level Shader Language). For online usage on the web, you can program GLSL-based shaders via WebGL, for example… and, "um, errrrr…", you could use Flash as a weapon of choice, which now has GPU acceleration too (thank you Adobe - by nature the average Flash developer likes to deal with assembler code…).

So, "Yes M'am…", shaders for 3D stuff and games - but when it comes to 'real' projects, shaders can be more than helpful too, doing all those nifty tasks that need performance and speed. As an example, the 'Samsung Notasso' campaign I was involved in last winter. The nice people from the Amsterdam-based agency Code D'Azur contacted me to write a bunch of Pixel Bender-based plugins to use with Flash for that project, to do all the heavy and time-critical image processing: automated face detection, automated cutting-out of detected faces from the image's background, and transforming every single copy into a unique piece of art (kinda). Donkey work(!), 'cause cutting out faces automatically while preserving as many face details as you can - a.k.a. hair, contours, neck and so on - is not trivial at all, and doing this in Flash 10 Pixel Bender without GPU acceleration truly IS madness… but in the end, manageable! We separated the shader tasks into two plugins: one called 'The Scanner', which does all the face detection and validates whether we detected a good face shot or not; the second plugin I called 'The Composer', which does all the hard work, distributed over many little single shader tasks (like one for edge detection, one for skin preservation, one for the actual cut-out of the face, one for contour recovery, and many more…).

You can give it a try here:

Outputs from the Notasso Composer Plugin
Launch
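One of the smaller shader tasks mentioned above is deciding which pixels count as skin. The Notasso plugins' exact rules aren't public, so purely as an illustration, here is the classic RGB skin heuristic (Peer/Kovac-style rules) in plain Javascript:

```javascript
// Classic RGB skin-color heuristic (per pixel, 0..255 channels):
// skin is reddish, not gray (sufficient channel spread), and red
// dominates both green and blue. Crude, but cheap enough to run
// per pixel in a shader - treat as an illustration, not the plugin's rules.
function looksLikeSkin(r, g, b) {
  return r > 95 && g > 40 && b > 20 &&
         Math.max(r, g, b) - Math.min(r, g, b) > 15 && // not grayish
         Math.abs(r - g) > 15 &&                        // red/green spread
         r > g && r > b;                                // red dominates
}
```

In a shader the same comparisons become a couple of step()/mix() calls, producing a mask the cut-out pass can dilate and feather.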

That's it so far… the idea is to continue this article with extracts from my latest projects and works, showing the diverse usage of shaders there. So this was 'Episode 1' - up next… shaders really like dubstep, watch out!

Edit: Wanna see and know more about shaders? Me is doing a session at the upcoming 'Beyond Tellerrand' conference in Cologne (the place to be), bringing along all-new fresh stuff and experiments, so maybe see you there…

*Resources: http://tinyurl.com/c67sn8m & http://en.wikipedia.org/wiki/Shader

Beatbox. October 31, 2011 at 12:49 pm

Busy days these days - filled with working on lots of exciting projects & ideas over the last weeks. One of these projects is another full visualization of a dubstep tune, running in realtime (a follow-up to the first tune I showed you as a realtime demo during my session at this year's Flash on the Beach). Being quite happy with my result for that tune, the record label once again gave me the freedom to develop the whole visual set for the second tune on my own *muhahahaha* - so I have total freedom to go for every form, color, structure & stuff I can imagine. Simply amazing; but whilst doing all these wonderful projects, the reverse of the medal is that there's so little time left to decouple little snippets and experiments to post here…

True… but now: lately I took some time and uncoupled a snippet - a little audio-reactive shader sequence from the second tune visualization I'm currently working on - to test it separately. Best practice to test several options and possibilities, and perfect to use as a blog post and sign of life here!

Well, knowing my work, you probably won't wonder that it's all about raymarching and deforming/displacing implicit volumes and surfaces again. Btw., want to dig deeper into this dark matter? Inigo Quilez released a brilliant article with all the basic info you need to get started. But for now - let's take some WebGL and get this little audio-reactive thingy going…
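Raymarching, in case the term is new to you, really is just one loop: step along the ray by the distance a signed distance function reports, until you land on a surface. A bare-bones sphere-tracing sketch in Javascript (the demo does this per pixel in GLSL; names and constants here are mine):

```javascript
// Signed distance to a sphere at the origin: negative inside, 0 on the
// surface, positive outside.
const sphereSDF = (p, radius) => Math.hypot(p[0], p[1], p[2]) - radius;

// Sphere tracing: the SDF value is always a safe step size, so we can
// never overshoot the surface.
function raymarch(origin, dir, maxSteps = 64, epsilon = 1e-4) {
  let t = 0; // distance travelled along the ray
  for (let i = 0; i < maxSteps; i++) {
    const p = [origin[0] + dir[0] * t,
               origin[1] + dir[1] * t,
               origin[2] + dir[2] * t];
    const d = sphereSDF(p, 1.0);
    if (d < epsilon) return t; // hit: return travel distance
    t += d;                    // safe step forward
    if (t > 100) break;        // flew past everything
  }
  return -1; // miss
}
```

All the deforming/displacing happens inside the SDF: add a sine wave or an audio-driven term to the returned distance and the surface starts to wobble.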

Achtung!!! May take a bit to load the sound first…

Beatbox
Launch
Side note: best viewed in Chrome; Firefox will do as well but takes ages to validate the shader…

What else, what else… hmmm… hell yeah - I just got invited to speak once again at FITC Amsterdam 2012. Be sure I'll come there with a new session topic (called: Highly Illogical), bringing along all the new FP11 & WebGL stuff I'm working on right now. 'Early Birdies' are already on sale, so be quick - see you there…
 

Does not compute. August 3, 2011 at 10:39 am

Phew! There's nothing more frustrating than things not working out the way you expect 'em to. It seems that lately everyone but me was able to use this little piece of code written in Python by Mr.doob right from the start. All I got when trying to run it were tons of error messages while compiling. But this script is actually very handy for effectively 'analyzing' audio files, in order to then sync them, e.g. via WebGL. So let's quickly update these few lines of magic for use with Python 3 (which I have installed) instead of Python 2 (which Mr.doob's actual code is based on). Meaning: kick out xrange and replace it with range, and set brackets when using print(). And, most important if you don't want to end up like me with just half a brain - check the file's encoding before wondering why it's still not working (or just be lazy and download the updated code here)… Afterwards, assuming you already have Python installed on your machine, open the Windows command prompt, switch to your project folder and simply execute audio.py like this:

Once you've got this running, let's leave these bone-dry parts and switch over to the more exciting visual fun parts - meaning let's do something with our freshly received audio information and write some audio-driven stuff in WebGL. For instance the following little thingy, which I programmed last night and called - for lack of a better word - 'Strange Lattice'.
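To make the bridge from analysed audio data to shader explicit: per frame you normalise the levels to 0..1 and smooth them a bit before handing the value to a uniform, otherwise the visuals flicker with every transient. A minimal sketch (names are mine, not from Mr.doob's script or my demo):

```javascript
// Normalise a sequence of per-frame audio levels to 0..1 and apply a
// simple one-pole low-pass so the visuals don't twitch on every transient.
// The smoothed values are what you'd feed into a shader uniform each frame.
function normalizeLevels(levels, smoothing = 0.8) {
  const max = Math.max(...levels, 1e-6); // avoid division by zero on silence
  const out = [];
  let prev = 0;
  for (const v of levels) {
    const n = v / max;                             // normalise to 0..1
    prev = prev * smoothing + n * (1 - smoothing); // low-pass filter
    out.push(prev);
  }
  return out;
}
```

In the render loop that becomes one `gl.uniform1f(levelUniform, smoothed[frame])` per frame, and the shader does the rest.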

Strange Lattice
Launch
(your browser needs to support WebGL!)

I know - I know, some of you still refuse to update your system to an up-to-date browser that supports proper and sexy WebGL, like Chrome or Firefox. All right, I uploaded a video of this experiment to my Vimeo account as well.

That´s it - enjoy!

!!! Edit: Updated the shader for use with Chrome Canary, but it should execute on Chrome 12 + 13, Firefox & Safari 5.1 as well