Lightdriver

Drive WS2812B LED strips from a PMOD device, a lightdriver project @ anfractuosity.com:

I created a PMOD module PCB using KiCad that enables connecting WS2812B lighting strips to an FPGA board with a PMOD interface. The board was assembled by JLCPCB.
This is my first project using an FPGA. I plan to implement an SPI interface soon, so that the FPGA can accept colour pixels over SPI from a Raspberry Pi and then drive the LEDs appropriately. I am using the original Zybo board, which uses a Zynq FPGA, although I'm not using the ARM portion of the chip because I want to learn VHDL.

Project files are available on GitHub.


Reflow Toaster Oven – a Qwiic Hack!

While working from home during the past few months, I have greatly missed being able to throw some prototypes through the super nice SparkFun production reflow oven.

zoom in of solder paste reflowing in a toaster oven

Ahhh, so glad I didn't have to hand solder each of those blobs.

I really wanted a reflow oven in my garage, so after doing some research online, I decided to buy a cheap toaster oven and give this hack a try. Ultimately, I decided to use a standard hobby servo to automate the movement of the temperature knob like so:


Just like a push-pull linkage in a model airplane!

Although slightly less elegant than some of the other methods out there, this servo control method works quite well. Additionally, it didn't require any modifications to the wiring of my toaster oven and was pretty darn cheap and simple to set up.

Research

I came across many blogs, websites and products dedicated to hacking toaster ovens into reflow ovens.

The most important things I learned were the following:

  1. Get a toaster oven with these features (I purchased the Black & Decker TO3000G, which was $59 new):

    • Analog dial control style
    • Quartz elements (two above and two below if you can)
    • A convection fan
    • Small or medium-sized
  2. A single thermocouple "floating" in the air near the PCB works fine. Note, a lot of the pro-end models have multiple temp sensors (at multiple points in the chamber and one touching the PCB), but I haven't found the need yet. The SparkFun Qwiic Thermocouple and braided thermocouple work great!

  3. Opening the door seems to be the only way to get a good and fast cool down. Even the ~$4,000 pro machines have the drawer open automatically to achieve good "free air" cooling. I have yet to automate this step, but so far I love watching the solder flow. When I see my reflow time is up, I have been manually opening the door. If I find the need to automate this in the future, I'll most likely get another servo or stepper motor involved.

  4. Most toaster ovens can't achieve an ideal ramp-up rate of 1.5C/second, but that's okay. You can compensate with a slower ramp during your soak phase - a gradual climb from 150C to 180C. More on this later as we discuss the profile and the code, and show some results.

Parts List

Manual testing

Before rigging up a servo to move the control knob, I first gathered parts and got my thermocouple readings working. Then the plan was to simply move the control knob manually (with my hand), and see if I could get a decent profile while watching the data plotter.


My very first attempt manually adjusting the knob by hand.

Following the profile suggestions in the CompuPhase article and reading what Wikipedia had to say, I decided to try for a profile represented by the blue line in the graphic above. The red line represents the live readings from the thermocouple. Note: in all of my serial plotter graphics, the blue line is plotted as a reference only; my controller does not analyze the blue line in any way.

And so I tried again:


Second attempt adjusting temp knob by hand.

As you can see, I started taking notes to keep track of what was working and what was not. For my second attempt, I actually set the oven to "warm" for ten seconds before starting my plotter. The adjustments noted in the graphic were done by me physically moving the dial with my hand.

By my fourth manual test, I was pretty pleased with the results - not perfect by any means, but starting to look a bit like an actual profile:


Fourth attempt by hand, starting to look decent.

Servo Hack

Now that I knew a decent profile could be achieved by simply moving the temp dial, it was time to slap a servo on this thing and automate the movements!

First things first, I ripped off that control knob. A flat-head screwdriver helped greatly to get it started. Next, I mounted the servo horn to the knob with a servo mount screw in the center hole of the servo horn.

Temp knob removed. Mount with center point screw first.

Then, I drilled out two more holes for a good secure mount. Note: I'd make sure to align the positioning mark (that little silver thing) directly down. This will ensure that the servo linkage and rotation can achieve the full 180 degrees of motion.

Mounted servo horn with center screw. Horn mounting complete.

Next was actually mounting the servo to the oven. I had a scrap piece of wood that fit the job exactly, but anything that can extend the servo out past the face of the oven should work. I opted to use dual-lock hook and loop, but servo tape or double-sided foam tape would have worked too. Make sure to align properly both along the center point of the servo horns and the depth away from the oven.

Center point alignment. Depth alignment.

Finally, the last step was to link the two control horns. There are many ways to do this with various linkage hardware, but the simplest is to use bare control rods and add "Z-bends" at specific points. I first tried this with a single control rod, but it was much smoother with two!

Z-bend. Dual control rod hooked up.

The Code

The source code for the project can be downloaded from the GitHub repository.

Using Example 1 from the Qwiic Thermocouple Arduino Library and a basic servo control example, I was able to piece together a sketch that accomplishes a pretty nice reflow profile.

Before diving into the code, I'd like to highlight the fact that this project does not use PID control to try to follow a set profile. I first proved that manually moving the temp knob by hand could create a decent reflow profile. By watching the temperature and keeping track of time, my Arduino can determine the current zone (preheat, soak, ramp-up, reflow, cooldown). Once you know the zone, you can simply set the servo to predetermined positions (or toggle it on/off, as in the soak zone) to hold a close-enough temperature or ramp up.

My plan was:

  1. Set full power
  2. Wait to see 140C, kill power, enter soak zone
  3. During soak, toggle power on/off to climb from 150C to 180C
  4. Once soak time is complete, turn on full power (enter ramp-up zone)
  5. During ramp-up, wait to see reflow temp (220C) (enter reflow zone)
  6. During reflow, toggle servo on/off to hold temp
  7. Once reflow time is complete, turn servo to "off," indicating user should open door

My main loop simply uses a for loop to step through the zones, and when it's done it stops with a while(1). It is updateServoPos() that actually determines the zone and updates a global variable.
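
Both snippets also lean on a handful of globals that aren't shown here: the reference profile arrays, the zone constants, and the sensor and servo objects. The sketch below is a rough guess at what they might look like: the names come from the code, but the library header, profile values and constants are assumptions, not the author's exact source.

// Assumed globals for the snippets that follow. Names come from the code;
// the library header, profile values and constants are illustrative guesses.
#include <Servo.h>
#include <SparkFun_MCP9600.h> // SparkFun Qwiic Thermocouple library (assumed header name)

enum { ZONE_preheat, ZONE_soak, ZONE_ramp, ZONE_reflow, ZONE_cool };

#define SOAK_TIME 90          // seconds of soak (illustrative)
#define COOL_DOWN_SECONDS 120 // seconds to keep logging after reflow (illustrative)

// Reference profile plotted as the blue line: target temps (C) and the elapsed
// times (seconds) at which they should be reached (illustrative values).
int profileTemp[]      = { 25, 150, 180, 220, 220, 25 };
int profileTimeStamp[] = {  0,  90, 180, 240, 270, 330 };

MCP9600 tempSensor;           // Qwiic Thermocouple
Servo knobServo;              // hobby servo on the temp dial

int   zone = ZONE_preheat;
int   soakSeconds = 0, reflowSeconds = 0, totalTime = 0;
bool  soakState = true;
int   ovenSetting = 0;        // last commanded servo position, 0-100 "percent power"
float currentTemp = 0, currentTargetTemp = 0;

With those assumptions in place, here is the main loop: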

void loop()
{
  for (int i = 0 ; i <= 4 ; i++) // cycle through all zones - 0 through 4.
  {
    int tempDelta = abs(profileTemp[i + 1] - profileTemp[i]);
    int timeDelta = abs(profileTimeStamp[i + 1] - profileTimeStamp[i]);
    float rampIncrementVal = float(tempDelta) / float(timeDelta);
    currentTargetTemp = profileTemp[i];

    // Print data to serial for good plotter view in real time.
    // Print target (for visual reference only), currentTemp, ovenSetting (aka servo pos), zone
    for (int j = 0 ; j < (timeDelta - 1); j++)
    {
      currentTargetTemp += rampIncrementVal;
      Serial.print(currentTargetTemp, 2);
      Serial.print(",");

      if (tempSensor.available())
      {
        currentTemp = tempSensor.getThermocoupleTemp();
      }
      else
      {
        currentTemp = 0; // error
      }

      Serial.print(currentTemp);
      Serial.print(",");

      updateServoPos(); // Note, this will increment the zone when specific times or temps are reached.

      Serial.print(ovenSetting);
      Serial.print(",");

      Serial.println(zone * 20);
      delay(1000);
      totalTime++;
    }
  }

  // monitor actual temps during cooldown
  for (int i = 0 ; i < COOL_DOWN_SECONDS ; i++)
  {
    currentTargetTemp = 0;
    Serial.print(currentTargetTemp, 2);
    Serial.print(",");

    if (tempSensor.available())
    {
      currentTemp = tempSensor.getThermocoupleTemp();
    }
    else
    {
      currentTemp = 0; // error
    }

    Serial.print(currentTemp);
    Serial.print(",");
    updateServoPos();
    Serial.print(ovenSetting);
    Serial.print(",");
    Serial.println(zone * 20);
    delay(1000);
  }

  while (1); // end
}

The following function, updateServoPos(), is where the temperature and totalTime are checked to determine the zone we are currently in, and ultimately which servo positions are desired.

void updateServoPos()
{
  if ((zone == ZONE_preheat) && (currentTemp > 140)) // done with preheat, move onto soak
  {
    zone = ZONE_soak;
  }
  else if ((zone == ZONE_soak) && (soakSeconds > SOAK_TIME)) // done with soak, move onto ramp
  {
    zone = ZONE_ramp;
  }
  else if ((zone == ZONE_ramp) && (currentTemp > 215)) // done with ramp, move onto reflow
  {
    zone = ZONE_reflow;
  }
  else if ((zone == ZONE_reflow) && (reflowSeconds > 30))
  {
    zone = ZONE_cool;
  }

  switch (zone)
  {
    case ZONE_preheat:
      setServo(41);
      break;
    case ZONE_soak:
      soakSeconds++;
      if ((soakSeconds % 15) == 0) // every 15 seconds toggle
      {
        soakState = !soakState;
      }
      if (soakState)
      {
        setServo(100);
      }
      else
      {
        setServo(0);
      }
      break;
    case ZONE_ramp:
      setServo(100);
      break;
    case ZONE_reflow:
      reflowSeconds++;
      if ((reflowSeconds > 5) && (reflowSeconds < 10)) // from 5-10 seconds drop to off
      {
        setServo(0);
      }
      else
      {
        setServo(100);
      }
      break;
    case ZONE_cool:
      setServo(0);
      break;
  }
}

Some of these settings would surely need to be tweaked when using a different oven, thermocouple and servo setup. Slight variations in the servo hardware hookup will most likely require different servo position settings to get the desired position on the temp knob.
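
For reference, setServo() itself isn't shown in the snippets above. A minimal sketch of what such a helper might look like, assuming the Servo object from the earlier sketch and an angle range that spans the knob's travel (the end points will differ for every linkage):

// Hypothetical helper: turn a 0-100 "percent power" request into a servo
// angle on the temp knob. The end points are specific to each linkage.
void setServo(int percentPower)
{
  ovenSetting = percentPower;                     // logged to the serial plotter
  int angle = map(percentPower, 0, 100, 10, 170); // end points are illustrative
  knobServo.write(angle);
}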

Also, the position of the thermocouple can affect its readings, so each setup will most likely require different temperature trigger values (i.e. in the updateServoPos() function, you may need to adjust the if statements that determine the zone).

Last, paste and component size must be considered. After reflowing a few panels, I saw that the SAC305 paste I was using actually started reflowing closer to 220C, so I made that my target max reflow temp. If you have huge components, they can act as heat sinks and may require a longer soak and/or a higher peak reflow temp.

Results


My last profile cooking actual panels and looking good!

The above screenshot is from my latest achieved profile using servo control. The different colored lines represent the following:

  • Blue: Generic reference profile
  • Red: Temp readings showing actual profile in the oven
  • Yellow: Zones, stepping up through preheat, soak, ramp-up, reflow, cooldown
  • Green: Servo position, represented by 0-100 percent power

Although this last profile screenshot is the result of almost 20 test runs and 10 actual board panels reflowing, I would like to mention that even with some slight variation between profiles, all of my panels came out beautifully. A good paste job and proper placement of parts can greatly help your initial yield, but I even had a few sloppy paste jobs and they came out fine!

Conclusion

My complete setup in the garage. Panel in the oven.

If you're interested in reflowing some electronics using a toaster oven, it can be done using a simple Arduino, servo and thermocouple. It does take a little time to do some test runs and make slight adjustments to the provided Arduino sketch, but the results are pretty great! I currently have to watch the oven to manually open the door when it's done, but this could be automated. Plus, while one panel was in the oven, I simply started hand placing parts on the next panel and so no time was really lost.

If you would like to give this a try and have any questions, please reach out in the comments below or at the GitHub project repository.

Also, if you have any experience with solder reflow profiles, I'd love to hear your thoughts in the comments below.

Thanks for reading and cheers to some good DIY electronics SMD assembly!

Resources



A full-duplex tiny AVR software UART

Nerd Ralph writes:

I’ve written a few software UARTs for AVR MCUs. All of them have bit-banged the output, using cycle-counted assembler busy loops to time the output of each bit. The code requires interrupts to be disabled to ensure accurate timing between bits. This makes it impossible to receive data at the same time as it is being transmitted, and therefore the bit-banged implementations have been half-duplex. By using the waveform generator of the timer/counter in many AVR MCUs, I’ve found a way to implement a full-duplex UART, which can simultaneously send and receive at up to 115kbps when the MCU is clocked at 8MHz.

More details on the Nerd Ralph blog.


Simple machine learning with Arduino KNN

Machine learning (ML) algorithms come in all shapes and sizes, each with their own trade-offs. We continue our exploration of TinyML on Arduino with a look at the Arduino KNN library.

In addition to powerful deep learning frameworks like TensorFlow for Arduino, there are also classical ML approaches suitable for smaller data sets on embedded devices that are useful and easy to understand — one of the simplest is KNN.

One advantage of KNN is that once the Arduino has some example data, it is instantly ready to classify! We've released a new Arduino library so you can include KNN in your sketches quickly and easily, with no off-device training or additional tools required.

In this article, we'll take a look at KNN using the color classifier example. We've shown the same application with deep learning before — KNN is a faster and lighter-weight approach by comparison, but it won't scale as well to larger, more complex datasets.

Color classification example sketch

In this tutorial, we’ll run through how to classify objects by color using the Arduino_KNN library on the Arduino Nano 33 BLE Sense.

To set up, you will need the following:

  • Arduino Nano 33 BLE Sense board
  • Micro USB cable
  • Open the Arduino IDE or Arduino Create
  • Install the Arduino_KNN library 
  • Select ColorClassifier from File > Examples > Arduino_KNN 
  • Compile this sketch and upload to your Arduino board

The Arduino_KNN library

The example sketch makes use of the Arduino_KNN library.  The library provides a simple interface to make use of KNN in your own sketches:

#include <Arduino_KNN.h>

// Create a new KNNClassifier
KNNClassifier myKNN(INPUTS);

In our example INPUTS=3 – for the red, green and blue values from the color sensor.

Sampling object colors

When you open the Serial Monitor you should see the following message:

Arduino KNN color classifier
Show me an example Apple

The Arduino board is ready to sample an object color. If you don’t have an Apple, Pear and Orange to hand you might want to edit the sketch to put different labels in. Keep in mind that the color sensor works best in a well lit room on matte, non-shiny objects and each class needs to have distinct colors! (The color sensor isn’t ideal to distinguish between an orange and a tangerine — but it could detect how ripe an orange is. If you want to classify objects by shape you can always use a camera.)

When you put the Arduino board close to the object it samples the color and adds it to the KNN examples along with a number labelling the class the object belongs to (i.e. numbers 0,1 or 2 representing Apple, Orange or Pear). ML techniques where you provide labelled example data are also called supervised learning.

The code in the sketch to add the example data to the KNN function is as follows:

readColor(color);

// Add example color to the KNN model
myKNN.addExample(color, currentClass);

The red, green and blue levels of the color sample are also output over serial.

The sketch takes 30 color samples for each object class. You can show it one object and it will sample the color 30 times — you don’t need 30 apples for this tutorial! (Although a broader dataset would make the model more generalized.)
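
Pieced together, the sampling stage might look roughly like the sketch below. It assumes the readColor() helper and the class numbering from the example and would live in setup(); it illustrates the flow rather than reproducing the ColorClassifier source verbatim.

// Sketch of the sampling stage: 30 colour readings per class, each labelled
// with its class number before being handed to the classifier.
const int NUM_CLASSES = 3;         // Apple, Pear, Orange
const int EXAMPLES_PER_CLASS = 30;

for (int currentClass = 0; currentClass < NUM_CLASSES; currentClass++) {
  for (int i = 0; i < EXAMPLES_PER_CLASS; i++) {
    float color[3];                        // red, green, blue from the sensor
    readColor(color);                      // assumed to block until an object is close enough
    myKNN.addExample(color, currentClass); // supervised learning: sample plus label
  }
}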

Classification

With the example samples acquired, the sketch will now ask to guess your object! The example reads the color sensor using the same function it used when acquiring training data — only this time it calls the classify function, which will guess an object class when you show it a color:

 readColor(color);

 // Classify the object
 classification = myKNN.classify(color, K);

You can try showing it an object and see how it does:

Let me guess your object
0.44,0.28,0.28
You showed me an Apple

Note: It will not be 100% accurate especially if the surface of the object varies or the lighting conditions change. You can experiment with different numbers of examples, values for k and different objects and environments to see how this affects results.

How does KNN work?

Although the Arduino_KNN library does the math for you, it's useful to understand how ML algorithms work when choosing one for your application. In a nutshell, the KNN algorithm classifies objects by comparing how close they are to previously seen examples. Here's an example chart with average daily temperature and humidity data points. Each example is labelled with a season:

To classify a new object (the “?” on the chart) the KNN classifier looks for the most similar previous example(s) it has seen.  As there are two inputs in our example the algorithm does this by calculating the distance between the new object and each of the previous examples. You can see the closest example above is labelled “Winter”.

The k in KNN is just the number of closest examples the algorithm considers. With k=3 it counts the three closest examples. In the chart above the algorithm would give two votes for Spring and one for Winter — so the result would change to Spring. 

One disadvantage of KNN is the larger the amount of training example data there is, the longer the KNN algorithm needs to spend checking each time it classifies an object. This makes KNN less feasible for large datasets and is a major difference between KNN and a deep learning based approach. 
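
To make that concrete, here is a toy sketch of the same idea (not the Arduino_KNN code): measure the straight-line distance from a new (temperature, humidity) point to every labelled example, then let the k closest examples vote. The data points and numbers are made up purely for illustration.

#include <math.h>

struct Example { float temp; float humidity; int season; }; // 0 = Winter, 1 = Spring, 2 = Summer

Example examples[] = {
  {7, 74, 0}, {1, 88, 0}, {11, 68, 1}, {12, 66, 1}, {24, 45, 2}
};
const int NUM_EXAMPLES = sizeof(examples) / sizeof(examples[0]);

int classifyPoint(float temp, float humidity, int k)
{
  bool used[NUM_EXAMPLES] = { false };
  int votes[3] = { 0, 0, 0 };

  // Pick the k closest not-yet-used examples one at a time and tally their labels.
  for (int n = 0; n < k && n < NUM_EXAMPLES; n++)
  {
    int best = -1;
    float bestDist = 1e9;
    for (int i = 0; i < NUM_EXAMPLES; i++)
    {
      if (used[i]) continue;
      float dt = examples[i].temp - temp;
      float dh = examples[i].humidity - humidity;
      float d = sqrt(dt * dt + dh * dh); // straight-line distance with two inputs
      if (d < bestDist) { bestDist = d; best = i; }
    }
    used[best] = true;
    votes[examples[best].season]++;
  }

  // The label with the most votes wins.
  int winner = 0;
  for (int s = 1; s < 3; s++)
  {
    if (votes[s] > votes[winner]) winner = s;
  }
  return winner;
}

// classifyPoint(8, 72, 1) returns 0 (Winter): the single nearest example wins.
// classifyPoint(8, 72, 3) returns 1 (Spring): two of the three nearest examples are Spring.

With only two inputs the distance is measured on a flat chart; the colour classifier does exactly the same thing with three inputs (red, green and blue), as the next section describes.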

Classifying objects by color

In our color classifier example there are three inputs from the color sensor. The example colors from each object can be thought of as points in three dimensional space positioned on red, green and blue axes. As usual the KNN algorithm guesses objects by checking how close the inputs are to previously seen examples, but because there are three inputs this time it has to calculate the distances in three dimensional space. The more dimensions the data has the more work it is to compute the classification result.

Further thoughts

This is just a quick taste of what’s possible with KNN. You’ll find an example for board orientation in the library examples, as well as a simple example for you to build on. You can use any sensor on the BLE Sense board as an input, and even combine KNN with other ML techniques.

Of course, there are other machine learning resources available for Arduino, including TensorFlow Lite tutorials as well as support from professional tools such as Edge Impulse and Qeexo. We'll be inviting more experts to explore machine learning on Arduino in the coming weeks.


Keep an OpenMV Mind

Welcome, welcome, welcome! We have a load of new products to showcase today and it all starts with five new supporting products for the popular OpenMV H7 Camera, including WiFi and LCD shields, and three unique lens and module options. One of those options includes the ability to add a FLIR Lepton module, so we are now offering the sensor on its own! Rounding out the day we also have a new version of the :MOVE mini buggy for micro:bit. Now, let's take a closer look!

Customize your OpenMV H7 Cam!

OpenMV WiFi Shield

Out of stock - WRL-16776 - $30.00

The WiFi Shield gives your OpenMV Cam the ability to connect to the Internet wirelessly. This shield features an ATWINC1500 FCC Certified WiFi module that can transmit data at up to 48 Mbps, making it perfect for streaming video from the OpenMV Camera. Your OpenMV Cam's firmware already has built-in support for controlling the WiFi Shield using the network module.


OpenMV LCD Shield

18 available - LCD-16777 - $20.00

The LCD Shield gives your OpenMV Camera the ability to display what it sees on-the-go while not connected to your computer. This shield features a 1.8" 128x160 16-bpp (RGB565) TFT LCD display with a controllable backlight. Your OpenMV Cam's firmware already has built-in support for controlling the LCD Shield using the LCD module.


OpenMV Global Shutter Module

Out of stock - SEN-16775 - $50.00

The Global Shutter Camera Module allows your OpenMV Cam to capture high quality grayscale images not affected by motion blur. The module features the MT9V034 Global Shutter Camera Module, capable of taking snapshot pictures on demand and able to run 80 FPS in QVGA mode, 200 FPS in QQVGA mode, and 400 FPS in QQQVGA mode.


OpenMV FLIR Lepton Adapter Module

Out of stock - DEV-16779 - $15.00

FLIR Lepton 2.5 - Thermal Imaging Module

Only 13 left! - SEN-16465 - $195.95

The radiometric Lepton has 80x60 active pixels to capture accurate, calibrated, and non-contact temperature data in every pi…

The FLIR® Lepton® Adapter Module allows your OpenMV Camera to interface with the FLIR® Lepton® (version 1, 2 or 3) thermal imaging sensors for thermal vision applications. Combining machine vision with thermal imaging allows you to better pinpoint or identify objects you wish to measure the temperature of with astounding accuracy.

In order to help support this module, we are now offering the FLIR Lepton 2.5 Thermal Imaging Module on its own as well. Please be aware that we currently have a limit of one module per order.


OpenMV Ultra Wide Angle Lens

29 available - SEN-16778 - $15.00

The OpenMV Ultra Wide Angle Lens gives your OpenMV Camera the ability to see a wider field of view (FOV). This lens can easily be screwed into your existing module and has about a 100° FOV. The standard lens that ships with your OpenMV Camera has about a 70° FOV, which is good but not ideal for security applications.


Kitronik :MOVE mini MK2 buggy kit

Only 5 left! - ROB-16787 - $29.95

The Kitronik :MOVE mini MK2 buggy kit for the BBC micro:bit is the latest version of the ever-popular :MOVE mini that provides a fun introduction to robotics. The :MOVE mini is a two-wheeled robot suitable for autonomous operation, remote control projects via a Bluetooth application, or being controlled using a second BBC micro:bit via the micro:bit's radio functionality.


That's it for this week! As always, we can't wait to see what you make! Shoot us a tweet @sparkfun, or let us know on Instagram or Facebook. We’d love to see what projects you’ve made!

Never miss a new product!




All the OpenMV in One Place

Last week we released five new products to support one of our most popular machine vision sensors, the OpenMV H7 Camera!

OpenMV H7 Camera

In stock - SEN-15325 - $65.00

The OpenMV H7 Camera is a small, low power microcontroller board that allows you to easily implement applications using machine vision in the real world. Out of the box, it comes loaded with the MicroPython interpreter, so you don't need to load anything to get it up and running!

To support the OpenMV H7 Camera, we carry shields (one supporting WiFi capabilities and the other to give the sensor an LCD), a Global Shutter Module, a FLIR Lepton Module, and an Ultra Wide Angle Lens.

OpenMV LCD Shield

Only 7 left! - LCD-16777 - $20.00

The LCD Shield gives your OpenMV Camera the ability to display what it sees on-the-go while not connected to your computer.

OpenMV WiFi Shield

Only 6 left! - WRL-16776 - $30.00

The WiFi Shield gives your OpenMV Cam the ability to connect to the Internet wirelessly and is perfect for streaming video fr…

OpenMV FLIR Lepton Adapter Module

Out of stock - DEV-16779 - $15.00

The FLIR® Lepton® Adapter Module allows your OpenMV Camera to interface with the FLIR Lepton 1/2/3 Thermal Imaging sensors …

OpenMV Global Shutter Module

Only 3 left! - SEN-16775 - $50.00

The Global Shutter Camera Module allows your OpenMV Cam to capture high quality grayscale image snapshots not affected by mot…

OpenMV Ultra Wide Angle Lens

Only 5 left! - SEN-16778 - $15.00

The OpenMV Ultra Wide Angle Lens gives your OpenMV Camera the ability to see a 100° field-of-view (FOV).

We've pulled everything together into a one-stop location for all the information, products, projects and videos we have for the OpenMV. This page also provides you with a diagram of the OpenMV H7 Camera, so you can see all of the key features of the sensor in a single glance. You can get to this new page by clicking the link above or the button below.

Over the next few weeks and months, expect more content from us to be added to this page, including an updated video for the OpenMV H7 Camera, more projects, and tutorials. If you would like to stay up to date on all the new products we release each week, make sure to sign up for our newsletter below!

Never miss a new product!




Learning with Raspberry Pi — robotics, a Master’s degree, and beyond

Meet Callum Fawcett, who shares his journey from tinkering with the first Raspberry Pi while he was at school, to a Master’s degree in computer science and a real-life job in programming. We also get to see some of the awesome projects he’s made along the way.


I first decided to get a Raspberry Pi at the age of 14. I had already started programming a little bit before and found that I really enjoyed the language Python. At the time the first Raspberry Pi came out, my History teacher told us about them and how they would be a great device to use to learn programming. I decided to ask for one to help me learn more. I didn’t really know what I would use it for or how it would even work, but after a little bit of help at the start, I quickly began making small programs in Python. I remember some of my first programs being very simple dictionary-type programs in which I would match English words to German to help with my German homework.

Learning Linux, C++, and Python

Most of my learning was done through two sources. I learnt Linux and how the terminal worked using online resources such as Stack Overflow. I would have a problem that I needed to solve, look up solutions online, and try out commands that I found. This was perhaps the hardest part of learning how to use a Raspberry Pi, as it was something I had never done before, but it really helped me in later years when I would use Linux more than Windows. For learning programming, I preferred to use books. I had a book for C++ and a book for Python that I would work through. These were game-based books, so many of the fun projects that I did were simple text-based games where you typed in responses to questions.

A family robotics project

The first robot Callum made using a Raspberry Pi

By far the coolest project I did with the Raspberry Pi was to build a small robot (shown above). This was a joint project between myself and my dad. He sorted out the electronics and I programmed the robot. It was a great opportunity to learn about robotics and refine my programming skills. By the end, the robot was capable of moving around by itself, driving into objects, and then reversing and trying a new direction. It was almost like an unintelligent Roomba that couldn’t hoover, but I spent many hours improving small bits and pieces to make it as easy to use as possible. My one wish that I never managed to achieve with my robot was allowing it to map out its surroundings. This was a very ambitious project at the time, since I was still quite inexperienced in programming. The biggest problem with this was calibrating the robot’s turning circle, which was never consistent so it was very hard to have the robot know where in the room it was.

Sense HAT maze game

Another fun project that I worked on used the Sense HAT developed for the Astro Pi computers for use on the International Space Station. Using this, I was able to make a memory maze game (shown below), in which a player is shown a maze for several seconds and then has to navigate that maze from memory by shaking the device. This was my first introduction to using more interactive types of input, and this eventually led to my final-year project, which used these interesting interactions to develop another way of teaching.

Learning programming without formal lessons

I have now just finished my Master’s degree in computer science at the University of Bristol. Before going to university, I had no experience of being taught programming in a formal environment. It was not a taught subject at my secondary school or sixth form. I wanted to get more people at my school interested in this area of study though, which I did by running a coding club for people. I would help others debug their code and discuss interesting problems with them. The reason that I chose to study computer science is largely because of my experiences with Raspberry Pi and other programming I did in my own time during my teenage years. I likely would have studied history if it weren’t for the programming I had done by myself making robots and other games.

Raspberry Pi has continued to play a part in my degree and extra-curricular activities; I used them in two large projects during my time at university and used a similar device in my final project. My robot experience also helped me to enter my university’s ‘Robot Wars’ competition which, though we never won, was a lot of fun.

A tool for learning and a device for industry

Having a Raspberry Pi is always useful during a hackathon, because it’s such a versatile component. Tech like Raspberry Pi will always be useful for beginners to learn the basics of programming and electronics, but these computers are also becoming more and more useful for people with more experience to make fun and useful projects. I could see tech like Raspberry Pi being used in the future to help quickly prototype many types of electronic devices and, as they become more powerful, even being used as an affordable way of controlling many types of robots, which will become more common in the future.

Our guest blogger Callum

Now I am going on to work on programming robot control systems at Ocado Technology. My experiences of robot building during my years before university played a large part in this decision. Already, robots are becoming a huge part of society, and I think they are only going to become more prominent in the future. Automation through robots and artificial intelligence will become one of the most important tools for humanity during the 21st century, and I look forward to being a part of that process. If it weren’t for learning through Raspberry Pi, I certainly wouldn’t be in this position.

Cheers for your story, Callum! Has tinkering with our tiny computer inspired your educational or professional choices? Let us know in the comments below. 

The post Learning with Raspberry Pi — robotics, a Master’s degree, and beyond appeared first on Raspberry Pi.


Kaleidoscopic space art made with Raspberry Pi onboard the ISS

What could be the world’s first interactive art experiment in space is powered by Raspberry Pi!

The experiment, named Pulse/Hydra 3, features a kaleidoscope (as seen in the video) that lights up and starts to rotate after it receives heartbeat data from its ground terminal. This artistic experiment is designed to inspire people back on Earth.

Look closely at the video and you should be able to see small beads floating around in microgravity.

During scheduled events at museums and galleries, participants use a specially designed terminal fitted with a pulse oximeter to measure their pulse rate and blood oxygenation level. These measurements are transmitted in real time to the Pulse/Hydra 3 payload on the ISS, which is activated by the transmission.

Inside the payload, there’s a specially designed ‘microgravity kaleidoscope’. The transmitted data activates the kaleidoscope, and the resulting live images are securely streamed back to the ground terminal. The images are then projected onto large video screens so the whole audience can watch what is happening in orbit. The artistic idea is that both pulse rate and blood oxygenation levels are highly transient physiological characteristics that respond rapidly to conscious and sub-conscious emotional states. Therefore, there is a complex interaction between the participant and the payload, as both react to each other during the experience.

We wouldn’t have been able to achieve things like that on dial-up internet.

Where does it live?

Pulse/Hydra 3 is currently installed aboard the International Space Station (ISS) in the ESA Columbus module. The Columbus laboratory is ESA's biggest single contribution to the ISS. The cylindrical module, 4.5 m in diameter and 6.9 m in length, is equipped with flexible research facilities and provides accommodation for experiments in multidisciplinary research into material science, fluid physics, and life science.

Artist's cut-away view of the Columbus module elements (image credit: ESA)

This payload was launched on 29 June 2018 and it will be completing its two years in orbit soon.

More Raspberry Pi experiments in space

Pulse/Hydra 3 is, you guessed it, the third in a series of experiments run on board the Columbus module. The other two are:

  • Hydra-1, a plant growth experiment.
  • Hydra-2, a methanogenesis experiment exploring gravity’s effect on bacteria.

And Hydra-3 is the interactive art payload you’ve just read about. It lives in the same rack that used to house Hydra-1 and -2. All three run on Raspberry Pi!

Hydra-1, Hydra-2, and Hydra-3, all running on Raspberry Pi

These three payloads are of course great companions to our Astro Pi computers, which allow thousands of young people every year to run their code in space!

Place your bets on the year the first Raspberry Pi shop opens on the Moon…

The post Kaleidoscopic space art made with Raspberry Pi onboard the ISS appeared first on Raspberry Pi.


Feb 15, 2013 – Adafruit interviews Barack Obama, President of the United States #ObamaDayUSA

President Obama & Limor Fried “Ladyada”:
  • Proposes patent reform (0:50)
  • Asks the President if his daughters are considering careers in science and engineering (3:35)
  • Proposes each high schooler learns a computer programming language (5:12)


