Student project proposals

Projects may be conducted in groups or individually, and may be modified to suit the students' interests, skills and available time.

All projects here are aimed at developing the robot body of the Cyborg, built on top of the fully autonomous Pioneer LX base. The robot will roam the hallways of Glassgården as a campus mascot, displaying biological neural network activity through its LED head. The biological activity is obtained via a 60-electrode neural recording device, the MEA2100. Through these projects we hope to finalize the robot. The current version looks like the image on the right.

Robot Operating System (ROS)

This project utilizes ROS (Robot Operating System), which is relevant to all the projects below. ROS is one of the most widely used robotic development frameworks, with a very large online community. ROS is a great skill to have in your toolset and on your CV. If you know C++, Python or Lisp, you should be up and running with ROS in a short time.

ROS-based system architecture

See the overview on our Confluence wiki.


Projects

1. Top-level commander node and GUI

This project entails working on the ROS commander node and a web-based GUI (see Command center in the diagram above). The GUI communicates with the commander node, which is the top-level "commander" of all other nodes in the ROS system. The commander collects system information and sends it to the GUI, and it also accepts instructions from the GUI, such as setting the operating mode (behaviour, demo etc.), navigating to a point, and more. In detail, the project involves:

  • Continue the development of the robot's web-based GUI
    • Login for users sending operation requests to robot
    • No login for displaying basic status and location on map
    • Web page fits to screen on phone and tablet
  • Continue development of the commander ROS node running in the robot
  • Manage the ROS architecture on the robot

Some desired GUI features (not all need to be implemented):

  • Monitoring
    • View operating state: state-machine, manual operation, stopped...
    • View status information: battery charge, motor status... (optional: plot historical data)
    • View where the robot is in the map
  • Commands
    • Ability to change the operating mode
    • Ability to change state in the state-machine
    • Send robot to location
  • Manual operation:
    • Ability to view live video feed
    • Ability to control the robot using keyboard input
    • Use onboard microphone and speakers for two-way communication
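As an illustration, the commander's core mode-and-status bookkeeping (which both the ROS node and the GUI backend would build on) could look like the sketch below. All names and modes here are illustrative, not taken from the existing codebase:

```python
class Commander:
    """Minimal sketch of the top-level commander's state keeping."""

    MODES = ("behaviour", "manual", "demo", "stopped")  # assumed mode names

    def __init__(self):
        self.mode = "stopped"
        self.status = {"battery": None, "location": None}

    def set_mode(self, mode):
        # called when the GUI sends an operation request
        if mode not in self.MODES:
            raise ValueError("unknown mode: %s" % mode)
        self.mode = mode

    def update_status(self, **fields):
        # called as ROS nodes report battery charge, motor status, position...
        self.status.update(fields)

    def snapshot(self):
        # what the GUI would poll (or receive over e.g. a rosbridge websocket)
        return dict(self.status, mode=self.mode)
```

The actual commander would wrap this kind of state in a ROS node, subscribing to status topics and exposing mode changes as services or topics.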

Keywords: GUI, ROS, web app, database, Azure

2. Behaviour module

When the autonomous behaviour mode (see behaviour in the diagram above) is set by the commander node, the robot starts roaming around, controlling its LEDs and playing sounds according to sensory inputs, time, and a built-in emotional system. The goal of this project is to work on this autonomous mode and to give the robot personality. The project entails:

  • Work on the autonomous selection of states, e.g. using an emotional system or behaviour trees.
  • Send commands to the low-level ROS nodes (visualization, audio and navigation) to create behaviours.
  • Create new behaviours within the visualization and audio modules.
  • Possibly work on some of the LED dome's hardware (ESP32 controller, power supply, WS2812B LED strips) and Arduino code.
  • Collaborate with project 1 on transferring control back to the commander when the commander requests this.
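As a rough sketch of how an emotional system could drive state selection, the following toy example keeps a couple of scalar "moods" that are nudged by sensory events and decay toward neutral; the dominant mood picks the next behaviour. The mood names, events and thresholds are invented for illustration:

```python
class EmotionalSystem:
    """Toy emotional system: moods rise on events, decay over time,
    and the dominant mood selects the next behaviour state."""

    def __init__(self):
        self.moods = {"curiosity": 0.5, "tiredness": 0.0}

    def on_event(self, event):
        # sensory inputs nudge the moods (events are hypothetical)
        if event == "person_detected":
            self.moods["curiosity"] = min(1.0, self.moods["curiosity"] + 0.3)
        elif event == "low_battery":
            self.moods["tiredness"] = min(1.0, self.moods["tiredness"] + 0.5)

    def decay(self, rate=0.05):
        # called periodically so moods drift back toward neutral
        for k in self.moods:
            self.moods[k] = max(0.0, self.moods[k] - rate)

    def select_behaviour(self):
        # each behaviour would translate into commands for the
        # visualization, audio and navigation nodes
        if self.moods["tiredness"] > 0.7:
            return "return_to_dock"
        if self.moods["curiosity"] > 0.6:
            return "approach_person"
        return "idle_wander"
```

A behaviour-tree library or a state machine (e.g. SMACH in ROS) would be a more structured alternative to the if-chain above.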

Some LED behaviour inspiration: FamiLamp, 16x16x16 LED cube, 10x10x10 LED cube, 500 LED box

Keywords: ROS, Python/C++, embedded, LED, electronics

3. Navigation module

This project entails working on the cyborg_navigation and rosaria nodes (see diagram above). Tasks will include:

  • Work on the robot's navigation stack, newly ported to the standard ROS navigation stack.
  • Navigate to a point.
  • Manual remote control from the commander node and GUI.
  • Help project 1 implement navigation commands and map feedback in the GUI.
  • Use visual person coordinates from project 4.
  • Ensure safe driving in Glassgården.
  • Possibly incorporate 3D SLAM using the ZED stereo camera (see project 4).
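To give a flavour of the "visual person coordinates" task: a person detection expressed as range and bearing relative to the robot can be converted into a goal point in the map frame, which would then be sent to the navigation stack (e.g. as a move_base goal). The function name and standoff distance below are illustrative:

```python
import math

def person_to_map_goal(robot_x, robot_y, robot_yaw, rng, bearing, standoff=1.0):
    """Convert a person detection (range `rng` in metres, `bearing` in
    radians relative to the robot's heading) into a map-frame goal.
    The robot pose would come from the navigation stack's localisation.
    Stops `standoff` metres short of the person, not on top of them."""
    d = max(0.0, rng - standoff)
    gx = robot_x + d * math.cos(robot_yaw + bearing)
    gy = robot_y + d * math.sin(robot_yaw + bearing)
    return gx, gy
```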

Keywords: ROS, Python/C++, shell scripts, state-machine, navigation, SLAM, embedded, electronics

4. Robot vision

Two possible projects:

  • Work on the robot's object/person detector software using the YOLO neural network framework. This software detects humans in the camera images and sends relative coordinates to the ROS system, allowing the robot to seek out people to interact with. The system must be made to work with the new ROS-based navigation stack.
  • If the student has experience with 3D SLAM, this can be incorporated alongside the ROS navigation stack.
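As an example of the "relative coordinates" step, a detection's bounding box can be mapped to a horizontal bearing relative to the camera axis. The image width and field-of-view values below are assumptions for illustration, not measured values for the actual ZED setup:

```python
import math

IMG_WIDTH = 1280           # pixels (assumed image width)
HFOV = math.radians(90.0)  # horizontal field of view (assumed)

def bbox_to_bearing(x_min, x_max):
    """Map a bounding box's horizontal extent (pixels) to a bearing in
    radians relative to the camera axis; negative means left of centre.
    Uses a simple linear pixel-to-angle approximation."""
    cx = (x_min + x_max) / 2.0
    offset = (cx - IMG_WIDTH / 2.0) / IMG_WIDTH  # in [-0.5, 0.5]
    return offset * HFOV
```

With the ZED's depth data, the bearing plus a range estimate gives the relative person coordinates that the navigation module (project 3) would consume.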

These projects also entail some hardware considerations:

  • The ZED Stereo camera must be mounted on the body
  • An NVidia Jetson TX1 card is available for added processing power (the onboard computer has limited performance). If the student uses this board, it must be mounted within the robot and integrated with the rest of the system, i.e. it must communicate over a wired connection with the main onboard computer.

Keywords: Robot vision, object detection, navigation, ROS, ZED Stereo camera, NVidia Jetson TX1