
Simple, Low Cost Microscope Camera Streaming

The SARS-CoV-2 virus is forcing teaching modalities to change. The sciences will be directly affected, as much of their curriculum relies upon hands-on learning.

I spent a few hours over the weekend exploring ways to inexpensively add video streaming to your existing microscope cameras. I aimed to restrict the required components to things that are commonly found in classrooms, or that have utility beyond this project.

What you’ll need

You’ll need a camera, a microscope, a specimen, and a Raspberry Pi 3 or 4. It’s likely that any old PC that can run Linux and has a USB port and a network connection will do, but I’ve only tested this on a Raspberry Pi 3 and 4.

The eyepiece cameras that we have sold in recent times (TP-105 and TP-103) worked straight away, and any recent Touptek camera should work with minimal fuss. The Tucsen GT series will also work (tested with a GT12).

For the server, I originally tested with a Raspberry Pi 3, which we use here as a tracking-cookie blocker running Pi-hole. I have also successfully tested this on a Raspberry Pi 4. You’ll need local terminal or SSH access to the Raspberry Pi.

How to see if your camera will work

Raspbian comes with Video4Linux (V4L2) support pre-installed. You can see whether your camera will work by typing:

$ v4l2-ctl --list-formats-ext

If all goes to plan, you’ll get output like the following (this is from a GT12):

ioctl: VIDIOC_ENUM_FMT
    Index       : 0
    Type        : Video Capture
    Pixel Format: 'YUYV'
    Name        : YUYV 4:2:2
        Size: Discrete 640x480
            Interval: Discrete 0.067s (15.000 fps)
        Size: Discrete 320x240
            Interval: Discrete 0.067s (15.000 fps)
    Index       : 1
    Type        : Video Capture
    Pixel Format: 'MJPG' (compressed)
    Name        : Motion-JPEG
        Size: Discrete 800x600
            Interval: Discrete 0.033s (30.000 fps)
        Size: Discrete 1024x768
            Interval: Discrete 0.033s (30.000 fps)
        Size: Discrete 1600x1200
            Interval: Discrete 0.033s (30.000 fps)
        Size: Discrete 1920x1080
            Interval: Discrete 0.033s (30.000 fps)
        Size: Discrete 2304x1728
            Interval: Discrete 0.033s (30.000 fps)
        Size: Discrete 2592x1944
            Interval: Discrete 0.200s (5.000 fps)
        Size: Discrete 4000x3000
            Interval: Discrete 0.033s (30.000 fps)

If you only get the “YUYV” entry, your camera will still likely work, but only with raw, uncompressed video.

A camera that returns “MJPG” is a little better, as this is compressed video, which should mean the CPU has less work to do when we get to the streaming part.
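If more than one camera is attached (or the Raspberry Pi camera module is enabled), it helps to confirm which /dev/videoN node belongs to the microscope camera before going further. A quick check; the /dev/video1 node below is just an example:

$ v4l2-ctl --list-devices
$ v4l2-ctl -d /dev/video1 --list-formats-ext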

Testing image capture

Another useful app is fswebcam; if it isn’t already installed, sudo apt-get install fswebcam will add it. This will let you take a snapshot from the connected camera.

$ fswebcam image.jpg

By default, this grabs a frame, puts a banner on the bottom of the image and saves it as a file.

Bodhi Stem at 40x with TP-103 camera and M-100FLED microscope

Note that you might not have the image in focus, so set everything up with the camera’s standard software first. Also, this is the raw image: no auto exposure or white balance is applied.
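fswebcam also accepts a device, a resolution, a frame-skip count (to let the sensor settle), and a no-banner option, which are handy once you know what your camera reports. A sketch, assuming the camera is on /dev/video0 and supports 1024x768:

$ fswebcam -d /dev/video0 -r 1024x768 -S 5 --no-banner image.jpg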

Setting up a streaming server

Now we’re getting to the interesting part: setting up the streaming server. I tried Motion and MJPG-Streamer, and I prefer MJPG-Streamer. To install it, run the following:

$ sudo apt-get install libjpeg-dev
$ sudo apt-get install cmake
$ git clone https://github.com/jacksonliam/mjpg-streamer
$ cd mjpg-streamer/mjpg-streamer-experimental
$ make
$ sudo make install

This should download, compile, and install MJPG-Streamer.
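To confirm the install worked, you can check that the binary is on your path and, assuming the default install prefix, that the plugins landed under /usr/local/lib/mjpg-streamer:

$ which mjpg_streamer
$ ls /usr/local/lib/mjpg-streamer/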

Running the streaming server

The command to run the streaming server is:

$ mjpg_streamer -i "input_uvc.so -d /dev/video0 -r 640x360 -f 10" -o "output_http.so -p 9090 -w ./www"

This creates a web server that is accessible on port 9090 of your Raspberry Pi. Note that you need to run this command from the mjpg-streamer-experimental directory, because the -w ./www option points at the output plugin’s web pages relative to the current directory.
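If you’d rather not keep a terminal open, mjpg_streamer can fork into the background with its -b option. A sketch, assuming the repository was cloned into the pi user’s home directory:

$ cd ~/mjpg-streamer/mjpg-streamer-experimental
$ mjpg_streamer -b -i "input_uvc.so -d /dev/video0 -r 640x360 -f 10" -o "output_http.so -p 9090 -w ./www"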

If it worked, mjpg_streamer will start without errors and keep running in the terminal until you stop it.

Test the video stream by clicking on the “Stream” link, or set the device’s parameters using the “Control” link.

The control panel

640 x 480 seems to be the best resolution for YUYV video streams; an MJPG camera may happily produce higher-resolution video. Keep in mind, though, that the stream now needs to be viewed by multiple clients, so use the smallest resolution that maintains acceptable image quality and a high frame rate, to minimise the load on the Raspberry Pi and the network. If your camera only offers YUYV, a variant of the command is sketched after the option list below.

  • -d sets the device if you have multiple cameras
  • -r sets the capture resolution
  • -f sets the capture frame rate (there is no benefit in exceeding the camera’s native frame rate at that resolution)
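If your camera only reported YUYV earlier, input_uvc.so can be forced to capture in that format with its -y flag; mjpg-streamer then compresses the frames in software, which costs some CPU. A sketch, keeping the other settings from above:

$ mjpg_streamer -i "input_uvc.so -d /dev/video0 -r 640x480 -f 10 -y" -o "output_http.so -p 9090 -w ./www"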

Students’ devices (clients)

With a Raspberry Pi 3, this setup should handle around 5–10 clients; a Raspberry Pi 4 will handle more. I haven’t tested this thoroughly yet, so your comments below will be helpful.

Web browsers

Point students’ browsers to:

{raspberrypi}:9090/stream_simple.html 

or

{raspberrypi}:9090/stream.html
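The same web server can also return a single frame via ?action=snapshot, which is a quick way to check things from any machine on the network. A sketch, assuming curl is available on the client:

$ curl "http://{raspberrypi}:9090/?action=snapshot" -o snapshot.jpg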

VLC

File -> Open Network -> http://{raspberrypi}:9090/?action=stream
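The stream can also be opened from a terminal, which may suit a lab PC. A sketch, assuming VLC is installed on the client:

$ vlc "http://{raspberrypi}:9090/?action=stream"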

Next steps

This document on IotAlot was most useful. It can guide you to:

  • Set this up as a service that starts when the Raspberry Pi boots (a minimal systemd unit is sketched below).
  • Open the Raspberry Pi to the internet.

It should therefore also be possible to:

  • Use a DDNS service to create a domain name that points to this setup.
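As a starting point for the service idea above, here is a minimal systemd unit sketch. It assumes the repository was cloned to /home/pi/mjpg-streamer, that sudo make install put the binary in /usr/local/bin and the plugins in /usr/local/lib/mjpg-streamer, and that the file is saved as /etc/systemd/system/microscope-stream.service; adjust the paths and camera options to match your setup.

[Unit]
Description=MJPG-Streamer microscope camera stream
After=network.target

[Service]
User=pi
# Assumed plugin location from "sudo make install"; adjust if yours differs
Environment=LD_LIBRARY_PATH=/usr/local/lib/mjpg-streamer
WorkingDirectory=/home/pi/mjpg-streamer/mjpg-streamer-experimental
ExecStart=/usr/local/bin/mjpg_streamer -i "input_uvc.so -d /dev/video0 -r 640x480 -f 10" -o "output_http.so -p 9090 -w ./www"
Restart=on-failure

[Install]
WantedBy=multi-user.target

Enable and start it with:

$ sudo systemctl daemon-reload
$ sudo systemctl enable --now microscope-stream.service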

Other considerations:

  • You might also like to see whether it’s possible to add the streaming camera to your video conferencing software.
  • This method produces an HTTP connection. An RTSP or UDP stream would scale better to larger numbers of simultaneous users.


All rights reserved. Information is provided in good faith, presented as accurately as possible, but errors and omissions may exist.