This demo uses WebGL to show an animated 2D canvas on a 3D object, and it includes a 3D plane that displays video. It also takes advantage of the Audio Data API that is currently in the Firefox 4 Beta. I have been playing around with EnergizeGL and Processing.js, and I have found they complement each other and can easily be used together. One of the many cool features in EnergizeGL is the ability to easily use a 2D canvas as a texture on a 3D object. This of course makes it easy to apply Processing code to a texture, and it is just as easy to render video as a texture. EnergizeGL has many similarities to the Processing framework, so if you are familiar with Processing you will be able to pick up EnergizeGL quickly. This has been a fun experiment to put together; new technologies are always fun. It doesn't really have any point other than to show you what is possible with HTML5, WebGL, and the new Audio Data API. So let me be clear: you will not be able to view this demo if you do not have Firefox 4 Beta.

This demo will show you how to:

  1. Embed Processing.js as a texture on a 3D object in EnergizeGL
  2. Embed a video on a 3D plane in EnergizeGL
  3. Analyze audio data to create a simple 3D visualization

Only view this in Firefox 4 Beta!

WebGL Crystal Demo

To view the demo you will need Firefox 4 Beta.

Download Code Example: Please note you will need to run this on a server due to Firefox's security restrictions around the Audio Data API. Run it on a local server and the audio demo will work; open the files directly and the code will not work for you. If you want to run it on a remote server, you may also have to enable the web video MIME types on your server.
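If your server does not already serve the video MIME types, a typical Apache `.htaccess` entry looks something like the following. This is a sketch, not the demo's actual config; adjust the extensions to whatever formats your `<video>` tag actually uses.

```apache
# Map video/audio file extensions to the correct MIME types
AddType video/ogg  .ogv
AddType video/webm .webm
AddType video/mp4  .mp4
AddType audio/ogg  .ogg
```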

///// Initialize Processing variables. This code has to be in a separate code block. /////

PImage imgPic;
float y = 100;
String text1SizeDir = "up";
String text1Dir = "down";
int text1FontSize = 20;
int text1Ypos = 5;
int text2Ypos = 5;
String text1AlphaDir = "down";
float text1Alpha = 255;
int resizeText = 0;

void setup() 
{
        size(256, 256);  // Size should be the first statement
        stroke(255);     // Set line drawing color to white
        frameRate(20);
}	
void draw() 
{ 
		background(0);   // Set the background to black
		imgPic = loadImage(imageToProcessing[0]);
		resizeText++;
		image(imgPic, 0, 0);
		noFill();
		beginShape();
		curveVertex(84, 91);
		curveVertex(84, 91);
		curveVertex(68, 19);
		endShape();
		if (isEven(resizeText)) {
			if (text1SizeDir == "up") {
				text1FontSize += 1;
				if (text1FontSize > 20) {
					text1SizeDir = "down";
				}
			} else {
				text1FontSize -= 1;
				if (text1FontSize < 15) {
					text1SizeDir = "up";
				}
			}
		}
		if (text1Dir == "down") {
			text1Ypos += 1;
		}
		if (text1Ypos == 256) {
			text1Ypos = 0;
		}
        
        if (text1AlphaDir == "up") {
            text1Alpha +=2;
            if (text1Alpha > 255) {
                text1Alpha = 255;
                text1AlphaDir = "down";
            }
        } else {
            text1Alpha -=2;
            if (text1Alpha < 0) {
                text1Alpha = 0;
                text1AlphaDir = "up";
            }
        
        }
		
		// The constructor draws the text, so we only need to instantiate.
		new animatedFont("Vollkorn", text1Ypos, sc9Text[0]);

		if (text1Ypos > 20) {
			new animatedFont("Vollkorn", text1Ypos - 20, sc9Text[1]);
		}
		if (text1Ypos > 40) {
			new animatedFont("Vollkorn", text1Ypos - 40, sc9Text[2]);
		}
}

class animatedFont {
	animatedFont(String fontName, int yPos, String textMessage) {
		PFont font = loadFont(fontName);
		textFont(font, text1FontSize);
		fill(255, 255, 0, text1Alpha);
		text(textMessage, 5, yPos - 2);
		fill(255, 255, 255, text1Alpha);
		text(textMessage, 5, yPos);
	}
}
///// End Processing. This code has to be in a separate code block. /////
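The bouncing font size and fading alpha above both use the same ping-pong pattern: step toward a ceiling, flip direction, step back toward a floor. Here is a minimal stand-alone JavaScript sketch of that pattern; the `pingPong` helper is my own illustration, not part of the demo.

```javascript
// Ping-pong a value between min and max, flipping direction at the bounds,
// mirroring the text1FontSize / text1Alpha logic in the sketch above.
function pingPong(value, dir, min, max, step) {
  if (dir === "up") {
    value += step;
    if (value > max) dir = "down";
  } else {
    value -= step;
    if (value < min) dir = "up";
  }
  return { value: value, dir: dir };
}

// Walk a font size between 15 and 20, one step per frame.
var state = { value: 20, dir: "up" };
for (var frame = 0; frame < 12; frame++) {
  state = pingPong(state.value, state.dir, 15, 20, 1);
}
```

Like the Processing code, the value briefly overshoots the bound by one step before the direction flips, which is harmless for an animation.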


// Initialize global vars
	var imageToProcessing = new Array(),
		c = 102, cf = 2900, af = 0,
		executeProcessingCode, mouseX = 0, mouseY = 0, deltaM = 0,
		_currentX, _xRotation, _xRot = 0, radian, degree,
		audio, channels, rate, frameBufferLength, fft, magnitude,
		vidCanvas, context, cw, ch,
		sc9Text = new Array("SeaCloud9 --> Transmission", "Interactive Development", "1010101010101010101010");

	
    $(document).mousemove(function(e) {
        mouseX = e.pageX;
        mouseY = e.pageY;
    });
	
	function radian(){
		var rad = Math.atan2(mouseY, mouseX);
		return rad;
	};
	function degree(){
		var deg = radian() * 180 / Math.PI;
		return deg;
	};
	function _xRotation(){
		_xRot += Math.cos(degree() * Math.PI / 180);
		return _xRot;
	};
	
	jQuery(function($) {
	_currentX = -40 + (deltaM * 10);
	//myXRotation = new _xRotation();
    $('body')
        .bind('mousewheel', function(event, delta) {
            var dir = delta > 0 ? 'Up' : 'Down',
                vel = Math.abs(delta);
			deltaM = delta;
			_currentX = _currentX+deltaM*15;
            return false;
        });
		
		
// Mozilla Specific Audio Code /////////////////


			audio = document.getElementById('mediaElement');
			// not really a test for Firefox 4, but should avoid error messages in browsers 
			// that don't support audio or Float32Array (as used in FFT constructor below)
			if (typeof audio.currentSrc === "undefined" || typeof Float32Array === "undefined") {
				alert("This example requires Firefox 4: \n\nwww.mozilla.com/firefox/beta");
			} else {
				
				function loadedMetadata() {
					channels = audio.mozChannels;
					rate = audio.mozSampleRate;
					frameBufferLength = audio.mozFrameBufferLength;	 
					fft = new FFT(frameBufferLength / channels, rate);
				}
				
				function audioAvailable(event) {
					var fb = event.frameBuffer, i, fbl, t = event.time,
						signal = new Float32Array(fb.length / channels);
					
					for (i = 0, fbl = frameBufferLength / 2; i < fbl; i++) {
						// Assuming interleaved stereo channels,
						// split and merge them into a stereo-mix mono signal
						signal[i] = (fb[2*i] + fb[2*i+1]) / 2;
					}
				
					fft.forward(signal);
					
					
					for (i = 0; i < fft.spectrum.length; i++) {
						// Only the last bin's magnitude survives this loop;
						// that single value drives the visualization below.
						magnitude = fft.spectrum[i] * 4000;
					}
				}
				
				audio.addEventListener('MozAudioAvailable', audioAvailable, false);
				audio.addEventListener('loadedmetadata', loadedMetadata, false);
				
				// FFT from dsp.js, see below
				var FFT = function(bufferSize, sampleRate) {
					this.bufferSize = bufferSize;
					this.sampleRate = sampleRate;
					this.spectrum = new Float32Array(bufferSize/2);
					this.real = new Float32Array(bufferSize);
					this.imag = new Float32Array(bufferSize);
					this.reverseTable = new Uint32Array(bufferSize);
					this.sinTable = new Float32Array(bufferSize);
					this.cosTable = new Float32Array(bufferSize);
				
					var limit = 1,
						bit = bufferSize >> 1,
						i;
				
					while (limit < bufferSize) {
						for (i = 0; i < limit; i++) {
							this.reverseTable[i + limit] = this.reverseTable[i] + bit;
						}
						limit = limit << 1;
						bit = bit >> 1;
					}
				
					for (i = 0; i < bufferSize; i++) {
						this.sinTable[i] = Math.sin(-Math.PI/i);
						this.cosTable[i] = Math.cos(-Math.PI/i);
					}
				}; // FFT
					
				FFT.prototype.forward = function(buffer) {
					var bufferSize = this.bufferSize,
						cosTable = this.cosTable,
						sinTable = this.sinTable,
						reverseTable = this.reverseTable,
						real = this.real,
						imag = this.imag,
						spectrum = this.spectrum;
						
					var i;
					
					if (bufferSize !== buffer.length) {
						throw "Supplied buffer is not the same size as defined FFT. FFT Size: " + bufferSize + " Buffer Size: " + buffer.length;
					}
					
					for (i = 0; i < bufferSize; i++) {
						real[i] = buffer[reverseTable[i]];
						imag[i] = 0;
					}
					
					var halfSize = 1,
						phaseShiftStepReal,	
						phaseShiftStepImag,
						currentPhaseShiftReal,
						currentPhaseShiftImag,
						off,
						tr,
						ti,
						tmpReal;
					
					while (halfSize < bufferSize) {
						phaseShiftStepReal = cosTable[halfSize];
						phaseShiftStepImag = sinTable[halfSize];
						currentPhaseShiftReal = 1.0;
						currentPhaseShiftImag = 0.0;
						
						for (var fftStep = 0; fftStep < halfSize; fftStep++) {
							i = fftStep;
							while (i < bufferSize) {
								off = i + halfSize;
								tr = (currentPhaseShiftReal * real[off]) - (currentPhaseShiftImag * imag[off]);
								ti = (currentPhaseShiftReal * imag[off]) + (currentPhaseShiftImag * real[off]);
								
								real[off] = real[i] - tr;
								imag[off] = imag[i] - ti;
								real[i] += tr;
								imag[i] += ti;
								
								i += halfSize << 1;
							}
							tmpReal = currentPhaseShiftReal;
							currentPhaseShiftReal = (tmpReal * phaseShiftStepReal) - (currentPhaseShiftImag * phaseShiftStepImag);
							currentPhaseShiftImag = (tmpReal * phaseShiftStepImag) + (currentPhaseShiftImag * phaseShiftStepReal);
						}
					
						halfSize = halfSize << 1;
					}
					
					i = bufferSize/2;
					while(i--) {
						spectrum[i] = 2 * Math.sqrt(real[i] * real[i] + imag[i] * imag[i]) / bufferSize;
					}
				}; // FFT.prototype.forward
			
			} // requires-Firefox 4 warning

	});
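The `audioAvailable` handler above averages the interleaved left/right samples into a mono signal before running the FFT, and `FFT.prototype.forward` computes each spectrum bin as 2·sqrt(re² + im²)/N. Here are those two pieces of arithmetic in isolation; the function names and sample values are mine, made up for illustration.

```javascript
// Downmix an interleaved stereo frame buffer [L0, R0, L1, R1, ...]
// into a mono signal by averaging each L/R pair, as the demo does.
function downmixStereo(fb) {
  var signal = new Float32Array(fb.length / 2);
  for (var i = 0; i < signal.length; i++) {
    signal[i] = (fb[2 * i] + fb[2 * i + 1]) / 2;
  }
  return signal;
}

// Magnitude of one FFT bin, matching spectrum[i] in FFT.prototype.forward.
function binMagnitude(re, im, bufferSize) {
  return 2 * Math.sqrt(re * re + im * im) / bufferSize;
}

var mono = downmixStereo(new Float32Array([0.2, 0.4, -1.0, 1.0]));
// mono[0] is the average of 0.2 and 0.4; mono[1] is 0
```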
	
// EnergizeGL Code  /////////////////////////////////////

		function setup() {
			setBackgroundColor(0, 0);
			setAmbientColor(0.5);
			zFar(2000);
			usePointLight();
			setPointColor(1);
			setPointPosition(0, 0, 100);
			setShininess(3);
			useAlpha();
			loadTexture('skin', 'blueTxt.png');
			loadTexture('myTexture', 'blueTxt.png');
			createTexture('myCanvas2D', 256, 256);
			createTexture('vid', 512, 512);
			startLoading();
			star = new starMesh();
			star1 = new starMesh1();
			executeProcessingCode = new executeProceessing();
			crystalVid = new executeVideos();	
		}
		
		
			function starMesh(){
				startMesh('starMesh', c);
					for(var i = 0; i < c; i++) {
						addVertex(random(-200, 200), random(-200, 200), random(-200, 200));
					}
			endMesh(); 		
			}
			
			
			function starMesh1(){
				startMesh('starMesh1', c);
						for(var i = 0; i < c; i++) {
							setColor(random(0.2, 0.5), random(0.8, 1), random(0.5, 0.8));
							addVertex(random(-50, 50), random(-50, 50), random(-50, 50));
						}
				endMesh();
			}


		function draw() {
			setCamera(_currentX, 0, -100, 0, 0, 0, 1, 1, 1);
			//rotateX(_xRotation());
			rotateY(Math.sin(mouseY/50));
		   // rotateX(Math.sin(_xRotation()));
    		rotateZ(Math.sin(af/100));
			if(context != undefined){
				updateTexture('vid');
				useTexture('vid');
				rectangle(0,0,200,480,300);
				noTexture();
			}
			star2 = new starMesh1();
			updateTexture('myCanvas2D');
			useTexture('myCanvas2D');
        		cube(0, 0, 0, 512, 512, 512);
    		noTexture();		
			tween('starMesh1', 'starMesh', TRIANGLE_STRIP, 100, EGLEasing.sin(0, 100, 60 * magnitude, af), 'linear');
			af++;
			
		}

		function executeVideos() {
			if (context === null || context === undefined) {
				// Canvas context is not ready yet; try again later.
				setTimeout(executeVideos, 5000);
				return;
			}
			var ctx = getTextureCanvas('vid');
			audio.addEventListener('play', function() {
				drawVideo(this, ctx, cw, ch);
			}, false);
			updateTexture("vid");
		}
		
		function executeProceessing(){
			imageToProcessing.push("ballons.png");			
			// Get the Canvas2D this texture is made of
			var ctx = getTextureCanvas('myCanvas2D');
			var div1Element = ctx.canvas;			
			var processingCode = new Processing(div1Element, jQuery('#procssingCode').text());
			updateTexture("myCanvas2D");
		}
		
		StartEnergizeGL('appcanvas', '', 'applog');
		
		function isEven(value) {
			return value % 2 == 0;
		}


document.addEventListener('DOMContentLoaded', function(){
vidCanvas = document.getElementById('vidCanvas');
context = vidCanvas.getContext('2d');
cw = Math.floor(vidCanvas.clientWidth / 100);
ch = Math.floor(vidCanvas.clientHeight / 100);
//vidCanvas.width = cw;
//vidCanvas.height = ch;
},false);
function drawVideo(video, c, w, h) {
    if (video.paused || video.ended) return false;
    c.drawImage(video, 0, -100, 520, 700);
    // Repaint roughly every 20ms while the video plays
    setTimeout(drawVideo, 20, video, c, 520, 700);
}
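The `radian()`/`degree()` helpers earlier in the script turn the mouse position into an angle with `Math.atan2`, then convert radians to degrees. The same math in isolation (function names are mine, for illustration):

```javascript
// Angle of a point (x, y) from the origin, as the mouse helpers compute it.
function toRadians(x, y) {
  return Math.atan2(y, x);
}

// Radians-to-degrees conversion used by degree().
function toDegrees(rad) {
  return rad * 180 / Math.PI;
}

// A point on the 45-degree diagonal, e.g. the mouse at (100, 100).
var deg = toDegrees(toRadians(100, 100));
```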

I highly recommend downloading Firefox 4 Beta if you haven't yet. It is by far the best version of Firefox ever created, and the Mozilla team has really been working hard on implementing the new HTML5 capabilities in their browser. It is also worth noting that WebKit has added the Audio API as well, although you have to download it from a specific branch. So it really won't be long before all the browsers have audio and WebGL, not to mention CSS3. ChromeOS currently has WebGL capability, and it is also present in the Chrome Beta, the WebKit nightly, and Firefox. I believe most of these browsers will be out of beta in the first quarter, so relatively soon. The WebGL specification is now near final. It is also interesting that they are allowing extensions to be built, which in my opinion will work out really well for mobile devices. I look forward to seeing WebGL on the iPhone and Android in the near future. Mobile drives HTML5, and with the average life span of a phone and contract being two years, I see the browser landscape rapidly changing: people will adopt newer browsers sooner because of the shelf lives of their phones. In short, everything needs to be rewritten in the next five years to deal with mobile and stay valid.

If you missed it, there was a WebGL camp recently. I didn't have the opportunity to be there, although I have seen the screencasts, which were very informative; check them out if you missed them. Google sent a couple of representatives to discuss WebGL. It will be fun using WebGL on future projects. If you're interested in learning more about WebGL, I have found learningWebGL.com to be extremely helpful, and it is always up to date on what is going on in the WebGL community.

I have also been working on a couple of games in my spare time, and I am getting close to completing my first video game. I hope to have it out sometime in the new year. I am also anxious to take advantage of the Chrome Web Store and Mozilla Firefox's app store as well. I am glad to see browsers supporting this functionality directly, and I believe it will give a great many web developers a way to capitalize directly on their skill sets. I have been using Sencha Touch for the interface, and I believe it is a solid JavaScript framework, especially when it comes to mobile. To be honest, I don't think I have seen so many useful controls that work well together in any other mobile framework. I also like the fact that it uses an MVC design pattern. Sencha Touch is worth looking into if you haven't tried it yet. On the server, I recently installed NodeJS, and I have it working really well on Media Temple 🙂 I am looking forward to using NodeJS as the backend for some of my new apps and games. I really like the idea of writing everything in one language, JavaScript, and Node seems to simplify a great deal for me. If you are not familiar with NodeJS, check it out; it is server-side JavaScript.