Imagining a World Full of New Possibilities

The handheld Transferscope uses AI to transform real-world images into creative renderings, providing new perspectives that aid creativity.

Nick Bild
9 months ago · Machine Learning & AI
The Transferscope uses generative AI to reimagine the world (📷: Christopher Pietsch)

Have you ever wished that you could view the world from a completely different perspective? Maybe by breaking out of the box of conventional thinking and immersing yourself in a new reality, you might be able to spark your creativity. Doing so might help in solving a challenging problem, designing a new product, or even creating a beautiful work of art.

A creative artificial intelligence (AI) researcher named Christopher Pietsch recently created an experimental device that is designed specifically for this purpose. Called the Transferscope, this handheld device leverages the power of modern generative AI algorithms to dramatically transform the world around us. Or more accurately, it can transform a representation of the real world on its built-in display screen.

Looking something like a tricorder from Star Trek, the Transferscope is built into a custom-designed, 3D-printed case. The build is intentionally minimalistic, so as not to be a distraction in and of itself. It is held in the hand with the display facing the user, and a single button serves to create an imaginative, artistic rendering of the scene in front of the device.

Packed inside the case is a Raspberry Pi Zero 2 single-board computer that handles onboard processing tasks. The Raspberry Pi drives a 720 x 720 pixel display and captures images with an IMX519 image sensor. A pair of 18650 lithium-ion batteries powers the Transferscope.

The Raspberry Pi runs a custom Python script that leverages libraries like PyGame and OpenCV for a number of image processing tasks, like edge detection. But to make the real magic happen, the preprocessed images are transferred to a nearby workstation with an NVIDIA GeForce RTX 4080 GPU and an Intel i7-12700K CPU. There, Microsoft's Kosmos 2, a multimodal large language model capable of interpreting images, examines the captured photos. Then, with the help of Stable Diffusion, IPAdapter, and ControlNet, an entirely new image is synthesized. The new representation of the scene is transformed with different textures and patterns to give the user a totally new perspective. Due to the architecture of the system, images can be generated in less than a second, which keeps the creative process flowing.

Pietsch hopes that the Transferscope will become an important way that people interact with the world. The design of the system may help make that a reality. It relies on open source tools and a small number of off-the-shelf components, making the Transferscope relatively simple to reproduce. If you would like to look at the world with a fresh set of eyes, make sure you check out the full write-up of the project. You might even have all of the parts you need to build your own version of the device already in your spare parts drawer.

Nick Bild
R&D, creativity, and building the next big thing you never knew you wanted are my specialties.