I fairly recently acquired a tactile image printer, sometimes known as a swell paper machine. Essentially, you print or draw onto a specific type of paper, then run it through a machine which heats it up. Darker lines will then be raised, creating a tactile image.
Blind people experience so much image poverty. That is, we lack the same access to visual information as sighted people. Information is important, it helps us to understand the world we live in and build conceptual knowledge, and I feel the lack of it in my own life. So the prospect of being able to create my own tactile images has been an exciting one. I’m also in the process of setting up a braille embosser, which can also emboss basic graphics.
The actual process of creating these images started off pretty terribly. I didn’t expect much else. I’m new to this, not only to the whole idea of having access to images, but to the technology behind it.
I’m going to document my experiments and their results, so that they might be of use to other blind people who are stumbling their way through.
Image generation
I thought the problems might have started with generating an image. I used an AI to create some basic line drawings, giving it specific prompts, such as that the image should be black on white and that the line drawing should be minimalistic.
I then used a standard printer to print the drawing onto the swell paper. Actually setting up the printer is a whole other story, but I’m sure that’s not what readers are most interested in.
Once I printed off the image I ran the swell paper through the machine.
The results
The first thing I noticed was that the raised image sits on a large rectangle. The background must be shaded visually, as it would not have raised up otherwise. This first line drawing, of a butterfly, wasn’t particularly clear for me to feel, and the fact that the rectangle behind it had also raised up didn’t help.
I had the same issue with the next image, a sword, though I was able to trace the outline with my fingers. It was just nowhere near as clear as I knew it could be.
The second try
I wondered if the issue was that I was printing from my phone, and in the printer’s app, I have to specifically select that it’s a photo and not a document. So I connected the printer to my PC and tried again.
Unfortunately, the results were fairly similar. At first, the butterfly was actually a bit easier to feel, so I ran it through the machine again, hoping this would raise the lines further. Sadly it had the opposite effect. I assumed this was because the heat settings weren’t at the ideal spot, which I reasoned I could work on.
I wondered if the large background rectangle had more to do with how I generated the image using the AI than with the printer or anything else.
I knew of software specifically designed to prepare images for embossing or printing onto swell paper; however, as with most blindness-related software, it costs hundreds of pounds. The fact that the images didn’t turn out how I would like was frustrating, but I was more disappointed because the cost of the swell paper is so high. These are expensive errors to make. I did once watch Chancey Fleet from the New York Public Library go through this whole process, and I believe she did something to remove the background first, which clearly I had not.
This got me thinking. Perhaps she used the tactile image software which I don’t have. But was there another alternative? If so, maybe creating something I could truly feel wasn’t completely out of reach.
Third time lucky
I decided that at this point, I may as well give it another try. I opened the image in the standard Photos app on Windows and looked at the edit options. There was a whole section dedicated to the background, including a simple button to remove it. So I hit the remove background button, saved the image, and started the process again.
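For anyone who would rather script this step than use the Photos app, the same background removal can be sketched in Python with the Pillow library. This is only a rough sketch under one assumption: that near-white pixels are background and everything darker is line art. Since the swell machine raises anything shaded, the goal is pure black lines on a transparent (unshaded) background.

```python
from PIL import Image


def remove_background(img: Image.Image, threshold: int = 200) -> Image.Image:
    """Make near-white pixels transparent and force darker pixels to
    pure black, leaving only the line art to be raised by the machine."""
    rgba = img.convert("RGBA")
    cleaned = [
        # Dark pixel: keep as solid black. Light pixel: fully transparent.
        (0, 0, 0, 255) if (r + g + b) // 3 < threshold else (255, 255, 255, 0)
        for (r, g, b, a) in rgba.getdata()
    ]
    rgba.putdata(cleaned)
    return rgba


# Example usage (file names are illustrative):
# cleaned = remove_background(Image.open("butterfly.png"))
# cleaned.save("butterfly-clean.png")  # PNG preserves the transparency
```

The `threshold` value would likely need tuning per image, just as the machine’s heat settings do.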
The results were like night and day. Not only had the background been removed, but as I printed the image from the photos app on my PC, it seemed to fill the paper a lot better. The lines were clear and easy to feel, and there was no background in the way. I could run my fingers over the outline of the butterfly and easily identify every part of it. All by myself, with only my limited AI prompting skills and a bit of persistence, I’d created a tactile image to touch.
There are massive problems with AI art that I won’t even begin to cover here. Many artists are quite rightly angry with how it is being used. However, I was pleased that I was able to create something that will enable me to get a little more access to visual content.
Conclusion
I started this post by briefly mentioning image poverty. To print something off, all a sighted person needs is a printer, which you can get at all kinds of price points. You can often print for free in places like libraries. Not to mention being able to just look at the image on the screen of your phone. As I think this post demonstrates, for blind people to have access to this kind of content, the process is much more complicated and expensive. I’m lucky that I was given second-hand equipment to use, but that really is luck. Most blind people will never have access to this.
The cost of all of this is going to stop me from getting my hands on anything and everything. However, I’m still really looking forward to what I will be able to create in future. There are so many things that I have no idea about. So many iconic images that I have no concept of. Now, I might be able to.
Ironically, this post contains no images because although I did take photos of the output, when actually uploading them and adding them to this post they did not display very well. I guess you’ll just have to use your imagination.