Joining us now is the Lightmatter CEO. You are so in demand at this particular moment of generative AI. It is really energy inefficient to train a big AI large language model. Your work for years has basically been using light, as a science, as a technology, to make it more efficient. Can you strip down how that happens?

Nick: Absolutely. Today, when you look at AI models like ChatGPT and Gemini, these systems are built on 10,000-node, 60,000-node supercomputers, and the chips that comprise these massive supercomputers building AI models are burning an enormous amount of energy. So what we are doing at Lightmatter is figuring out how to scale to 100,000 nodes, how to drive the energy down. Just to give you a reference, today computer chips have the same energy density as a nuclear reactor, and between the time when I founded the company and today, they have actually increased by 5x the amount of power per package. It is a tough trajectory, and if we want to get bigger and b