PROJECTS IN THE INTELLIGENT ROBOTICS LABORATORY IN THE YEAR 2004.

Marek Perkowski, Professor ECE, Intelligent Robotics Laboratory



PROJECT NUMBER 1

"HEXOR ROBOT" OR HOW TO FIND A SODA CAN IN A MESSY ROOM.

PEOPLE: Nathalie Metzger, Bruce Yen, Rajiv Sudunagunta, Christopher Brawn.

GROUP MANAGER, SOFTWARE INTEGRATION: Nathalie Metzger.

ROBOT VISION: Bruce Yen.

SOFTWARE ON PC, INTERFACE, RADIO: Rajiv Sudunagunta.

PLANNING AND FUZZY LOGIC: Christopher Brawn.

EQUIPMENT: HEXOR robot, arms for Hexor, TV card for a PC, labyrinth. If you want, you can add more sensors, such as microphones. You should also add a speaker to the robot. The commands to the speaker are transmitted by radio from the controlling PC.

PROJECT STARTED: January 3, 2004.

The goal of this project is to familiarize students with some of the basic technologies used in robotics, in particular: remote control by radio, sensors, image processing, walking-robot control, and visual feedback.

PROJECT GOAL: You have to adapt the Hexor robot to be able to do the following:
  1. Collect literature (papers, books, webpages) on the following topics:
    1. Obstacle avoidance by mobile robots.
    2. Shortest path algorithms in labyrinths.
    3. Robots in mazes - algorithms, sensors, gripping and transporting some items from location to location.
    4. Algorithms for soda can collection - see MIT.
    5. Speech generation by robots.
  2. You have to build a square labyrinth, 4 meters by 4 meters (or similar), with different kinds of obstacles. Obstacles should be movable so that you can design, or generate randomly, various labyrinth layouts. Experiment with both few and many obstacles of different types.
  3. Put a small speaker on the robot through which the commands from the controlling PC will be spoken.
  4. Grasp items with grippers. The robot will have two grippers/arms controlled remotely from the PC using the visual feedback of a controlling human, who views the image from the TV camera located on the robot. This image is transmitted by radio to the human's PC. Example: grasping a soda can.
  5. Transport simple items such as soda cans or boxes from location to location. We assume that all objects are located at floor level.
  6. Navigate a labyrinth full of various kinds of obstacles. Obstacles are of different kinds, colors, shapes, and sizes. They may block certain kinds of sensors, such as infrared, but not others, such as the TV camera.
  7. Map the whole environment to be sure that all cans have been collected.
  8. Try to collect as many cans in as short a period of time as possible. A competition between the various programs/designs will be held to find the winner. For instance, 15 minutes will be given, and the total number of cans collected in 10 runs will be calculated.
  9. The cans should be placed in order somewhere at the edge of the labyrinth, and the robot has to communicate by voice "I have collected five cans" or something like that. You can add more speech behaviors to the robot. The robot may communicate its decisions or internal states through speech, such as "I am searching for a way out of this loop" or "Now I am trying to go back to the entrance", or finally "My battery is low, I am dying".
  10. All the intelligence will be in the PC. The processor on board will be used only to collect information from the servos and to perform the lowest-level motor control. This way you will be able to use the most sophisticated programming concepts of learning, knowledge acquisition, obstacle avoidance, etc., and program them in any language of your choice without concern for how much memory you have. A minimal sketch of this PC/robot split follows below.
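
As an illustration only, the sketch below shows how this split might look in Python, assuming the radio modem appears to the PC as a serial port (pyserial). The command strings, the port name, the baud rate, and the sensor reply format are invented placeholders; the real Hexor packet format must be taken from the robot's documentation.

# Sketch only: high-level planning on the PC, low-level execution on the robot.
# The command strings ("GAIT", "TURN", "SENSORS?") and the serial settings are
# hypothetical placeholders, not the actual Hexor radio protocol.
import serial

class HexorLink:
    def __init__(self, port="COM3", baud=9600):
        # Assumption: the radio modem appears to the PC as a serial port.
        self.ser = serial.Serial(port, baud, timeout=1)

    def send(self, command: str) -> str:
        """Send one low-level command and return the robot's reply line."""
        self.ser.write((command + "\n").encode("ascii"))
        return self.ser.readline().decode("ascii").strip()

    def read_sensors(self) -> dict:
        """Poll on-board sensor values; the reply format is a placeholder."""
        reply = self.send("SENSORS?")           # e.g. "IR=23 SONAR=118 BUMP=0"
        return dict(pair.split("=") for pair in reply.split())

# High-level logic (planning, mapping, learning) lives entirely on the PC:
if __name__ == "__main__":
    robot = HexorLink()
    while True:
        s = robot.read_sensors()
        if int(s.get("BUMP", 0)):
            robot.send("GAIT BACK 2")           # back up two steps
            robot.send("TURN LEFT 45")          # then turn away
        else:
            robot.send("GAIT FORWARD 1")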

I need a complete report with the plan of your work and information about your algorithms and ideas by February 1. Person responsible for the report: Nathalie. Please write to me about the mechanical design of the labyrinth: what can you find in the lab, and what do you need to purchase? This has high priority, since your group needs to build the labyrinth for experiments.
Problems to be discussed in the report and presentations:
  • Gaits for hexapods. General principles, types of gaits. Gaits for Hexor. New commands for rotation and turning right/left. New gaits?
  • Grippers. Function and control.
  • Architecture of the board on the robot.
  • Radio communication and formats.
  • How to have all sensor readings available on PC. What is the delay?
  • Algorithms for labyrinth navigation.
  • Algorithms for environment mapping.
  • Algorithms for can collection:
    1. First do the entire mapping, then plan the shortest path to collect the cans. Collect two cans at once.
    2. Whenever you see a can, take it and put it in storage, using the predicted shortest path.
    3. A combination of the above two algorithms.
  • Computer vision algorithms used for map building. What algorithms? Where to find them? Tests.
  • Sonar used for map building.
  • Infrared sensors used for map building.
  • Integration of three maps (one from the camera, one from the sonars, and one from the infrared sensors) into a single map. This map is 2.5-dimensional (the height of obstacles is taken into account).
  • Shortest path algorithms: Dijkstra, A*, genetic, simulated annealing, mixed. (A minimal grid-based sketch follows this list.)
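
As a starting point only, here is a minimal Python sketch of a shortest-path search on a randomly generated grid labyrinth. It assumes the fused map has already been reduced to a 2D occupancy grid of free/blocked cells; the grid size, obstacle density, and Manhattan heuristic are arbitrary illustrative choices, and A* stands in here for any of the algorithms listed above.

# Sketch only: A* on a 2D occupancy grid standing in for the fused labyrinth map.
# Grid size, obstacle density and the Manhattan heuristic are illustrative choices.
import heapq
import random

def random_grid(n=20, obstacle_prob=0.25, seed=1):
    random.seed(seed)
    return [[1 if random.random() < obstacle_prob else 0 for _ in range(n)]
            for _ in range(n)]

def astar(grid, start, goal):
    """Return a list of cells from start to goal, or None if unreachable."""
    n = len(grid)
    def h(c):                                   # Manhattan distance heuristic
        return abs(c[0] - goal[0]) + abs(c[1] - goal[1])
    frontier = [(h(start), 0, start, None)]
    came_from, cost = {}, {start: 0}
    while frontier:
        _, g, cur, parent = heapq.heappop(frontier)
        if cur in came_from:                    # already settled
            continue
        came_from[cur] = parent
        if cur == goal:
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if 0 <= nxt[0] < n and 0 <= nxt[1] < n and grid[nxt[0]][nxt[1]] == 0:
                ng = g + 1
                if ng < cost.get(nxt, float("inf")):
                    cost[nxt] = ng
                    heapq.heappush(frontier, (ng + h(nxt), ng, nxt, cur))
    return None

if __name__ == "__main__":
    grid = random_grid()
    grid[0][0] = grid[19][19] = 0               # keep start and goal free
    print(astar(grid, (0, 0), (19, 19)))
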
READINGS FOR NATHALIE FOR HOMEWORK 4.
Your task is to learn about the generation of emotional speech/dialog patterns with lip synchronization. Discuss two top research groups. What software is available for free? What are the main research issues to be solved? Create an excellent PPT presentation and write a report. Put them both on your WWW page and give them to me on paper.
1. Speech information from Cynthia Breazeal, MIT.
2. Shikano Laboratory system ASKA in Japan.


PROJECT NUMBER 2

"PROFESSOR PERKY" OR HOW TO USE MACHINE LEARNING BASED ON RELATION DECOMPOSITION TO CONTROL HOW THE UGLY MALE ROBOT HEAD MOVES AND WHAT IT SAYS.

PEOPLE: Stefan Gebauer, Normen Giesecke, Aminul, Robert Klug, Myron Machado, New-Undergrad-Student.

GROUP MANAGER AND SPEECH: Stefan Gebauer.

MECHANICAL DESIGN, SENSORS AND INTERFACING, MOVEMENT CONTROL: New student. Works closely with Stefan and Myron.

MACHINE LEARNING SPECIALIST: Normen Giesecke. Works mostly with Stefan. Organization of evaluation testing.

ROBOT VISION AND INTEGRATION: Myron Machado.

MACHINE VISION: Robert Klug. Development and integration of algorithms for face localization and face recognition.

SPEECH: Aminul (temporary). Complete the software as promised.

EQUIPMENT:
1. Professor Perky robot. Possibly add simple touch or temperature sensors. Use the microphone and camera, but do not put the camera in the eyes.
2. The camera is stand-alone and may be controlled from the PC.
3. Interfaces to the PC.
4. Fonix software.
5. Intel Vision Library.
6. CMU Image Processing and robotics software.


The goal of this project is to familiarize students with several robotics technologies: voice synthesis, voice recognition, computer vision, and the human-humanoid robot interface. Other goals are:
1. Manual control of robot movements. "Emotions".
2. Simple robotic control language.
3. Learning by relation decomposition and logic synthesis using MVSIS (see the data-collection sketch after this list).
This is a continuation of the Fall project. It is directly related to Robert's "work report" and Stefan's thesis.
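
Purely to illustrate what the learning data for relation decomposition might look like, the sketch below logs observed (stimulus, reaction) examples as a discrete-valued table. The variable names, value sets, and CSV export are invented placeholders; the actual input format expected by MVSIS has to be checked against the MVSIS documentation.

# Sketch only: collecting training examples for relation decomposition.
# Variable names, value encodings and the CSV format are placeholders; the
# real file format for MVSIS must be taken from the MVSIS documentation.
import csv

# Multi-valued input/output variables (assumed encodings).
FACE      = ["none", "known", "unknown"]        # from the vision module
SPEECH_IN = ["silence", "greeting", "question"] # from the speech recognizer
MOVE      = ["idle", "nod", "shake", "turn"]    # head movement command
REPLY     = ["none", "hello", "answer", "joke"] # utterance category

def record_example(table, face, speech_in, move, reply):
    """Append one observed (inputs -> outputs) example after validating values."""
    assert face in FACE and speech_in in SPEECH_IN
    assert move in MOVE and reply in REPLY
    table.append((face, speech_in, move, reply))

def export(table, path="perky_relation.csv"):
    """Dump the relation as a table that a decomposition tool could read."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["face", "speech_in", "move", "reply"])
        writer.writerows(table)

if __name__ == "__main__":
    examples = []
    record_example(examples, "known", "greeting", "nod", "hello")
    record_example(examples, "unknown", "question", "turn", "answer")
    export(examples)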

I need an improved report on February 6. Contact Aminul about when he will be done and how to integrate his work. Stefan, please assign work to the new student early. We will discuss this further in the Thursday meetings of my research group.

This should NOT be an attempt at a realistic robot; its behavior should be exaggerated, as in a puppet theatre or the Pizza Hut robot shows. It should be a caricature.

PROJECT NUMBER 3

"EMOTIONAL LADY" OR HOW TO USE ROBOT DESIGN TO DEMONSTRATE FEMALE EMOTIONS. Invent a name for your robot. It cannot be Cynthia or Maria. It should be some female name, perhaps.

PEOPLE:

GROUP MANAGER AND PROGRAMMING EMOTIONS: Person 1

MECHANICAL DESIGN, SENSORS AND INTERFACING, MOVEMENT CONTROL: Person 2. Possibly more people here, interested in mechanical design and control (with software) to demonstrate female emotions.

ROBOT VISION AND INTEGRATION: Person 3.

SPEECH: Person 4.

EQUIPMENT:
1. Female robot kit. Use female skin, hair, etc.
2. The camera is stand-alone and may be controlled from the PC.
3. Interfaces to the PC.
4. Fonix software.
5. Intel Vision Library.
6. CMU Image Processing and robotics software.


The goal of this project is to create realistic human-like emotions. You have to invent the personality of this robot, her knowledge, responses, profession, etc. It is similar to writing a scenario. Try to make this female as female as possible through the entire design. This should be an attempt at a realistic robot, not an animal-like one or a caricature.

TASKS:
1. Read about "emotional robots". There are several books and WWW pages. Find articles. Read Cynthia Breazeal's Ph.D. thesis.
2. Find a book (Martin has one) that has models of the movements of human face muscles used to represent various emotions. There is a rich literature on the animation of characters for video games and cartoons. You can use 3D modelling first.
3. Write a scenario for a conversation with your robot lady in which she will be able to go from one emotion to another. The emotions should be:
  1. anger
  2. smile
  3. surprise
  4. boredom
  The emotions as expressed by facial gestures are linked to what the robot says, and what it says depends on what it hears from the human. Also, the robot can react with its state to whom it sees. For instance, the robot can smile whenever it sees person X and frown when it sees person Y. (A minimal sketch of such an emotion state machine follows this list.)
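
As a conceptual sketch only, the robot's emotional state can be kept in a small state machine driven by what the vision and speech modules report. The emotion names come from the list above; the events, transition rules, and spoken lines are invented placeholders for the scenario you will write.

# Sketch only: a tiny emotion state machine for the conversation scenario.
# Events ("sees_known_person", "insult", ...) and spoken lines are placeholders;
# the real triggers come from the vision and speech-recognition modules.
EMOTIONS = {"neutral", "anger", "smile", "surprise", "boredom"}

# (current_emotion, event) -> (next_emotion, what the robot says)
TRANSITIONS = {
    ("neutral",  "sees_known_person"):   ("smile",    "Hello, nice to see you again!"),
    ("neutral",  "sees_unknown_person"): ("surprise", "Oh! I do not think we have met."),
    ("smile",    "insult"):              ("anger",    "That was not very polite."),
    ("anger",    "apology"):             ("smile",    "Apology accepted."),
    ("neutral",  "long_silence"):        ("boredom",  "Is anybody there? I am getting bored."),
    ("boredom",  "sees_known_person"):   ("smile",    "Finally, some company!"),
}

class EmotionMachine:
    def __init__(self):
        self.state = "neutral"

    def on_event(self, event: str):
        """Update the emotion and return (facial expression, utterance)."""
        next_state, line = TRANSITIONS.get((self.state, event),
                                           (self.state, ""))   # unknown events keep the state
        self.state = next_state
        return next_state, line

if __name__ == "__main__":
    lady = EmotionMachine()
    for event in ["long_silence", "sees_known_person", "insult", "apology"]:
        print(event, "->", lady.on_event(event))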



PROJECT NUMBER 4

"EMOTIONAL MAN" OR HOW TO USE ROBOT DESIGN TO DEMONSTRATE MALE EMOTIONS. Invent a name for your robot. It cannot be Professor Perky. It should be some male name, perhaps.

PEOPLE: Randy Borck, Ray Schmelzer, Jeff Allen, Sian Rees.

GROUP MANAGER AND PROGRAMMING EMOTIONS: Person 1

MECHANICAL DESIGN, SENSORS AND INTERFACING, MOVEMENT CONTROL: Person 2. Possibly more people here, interested in mechanical design and control (with software) to demonstrate male emotions.

ROBOT VISION AND INTEGRATION: Person 3.

SPEECH: Person 4.

EQUIPMENT:
1. Female robot kit. Use male skin, hair (or lack of it), etc. More modifications are necessary; familiarize yourself with the male skins that we have in the lab. You may add more degrees of freedom and movements typical for a male. Stronger motors will perhaps be useful for this project. Use springs and gears. Rubber strings?
2. The camera is stand-alone and may be controlled from the PC.
3. Interfaces to the PC.
4. Fonix software.
5. Intel Vision Library.
6. CMU Image Processing and robotics software.


The goal of this project is to create realistic human-like emotions. This project is similar to Projects 2 and 3; the groups can share experiences and software, but the final scripts will be different. Try to make this male as male as possible through the entire design. This should be an attempt at a realistic robot, not an animal-like one or a caricature.


PROJECT NUMBER 6

EVOLVABLE ROBOT THAT LEARNS FROM ITS OWN MISTAKES HOW TO AVOID OBSTACLES. Invent a name for your robot. EvoBot?

PEOPLE: Kavitha Ramasamy.

EQUIPMENT:
1. Walking robot with controller.
2. ALTERA protoboard without connectors.
3. 3 female connectors (you have to purchase them).
4. Altera VHDL software.
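
As a conceptual sketch of the evolutionary loop only (independent of the final Altera/VHDL implementation), the code below evolves a lookup-table controller that maps discretized sensor readings to motor actions. The encoding, the simulated fitness function, and all parameters are invented placeholders; on the real robot, fitness would be measured by driving and counting collisions.

# Sketch only: the evolutionary loop behind "learning from its own mistakes".
# The controller encoding (sensor pattern -> action lookup table), the simulated
# fitness function and all parameters are placeholders; the real controller
# would run on the Altera board and be scored on the physical robot.
import random

ACTIONS = ["forward", "left", "right"]
N_SENSOR_PATTERNS = 8           # e.g. 3 binary IR sensors -> 8 patterns (assumption)

def random_controller():
    """A controller is just a table: sensor pattern index -> action."""
    return [random.choice(ACTIONS) for _ in range(N_SENSOR_PATTERNS)]

def fitness(controller, trials=50):
    """Placeholder fitness: reward going forward, punish 'collisions'.
    On the real robot this would be measured by driving and counting bumps."""
    score = 0
    for _ in range(trials):
        pattern = random.randrange(N_SENSOR_PATTERNS)   # fake sensor reading
        action = controller[pattern]
        obstacle_ahead = pattern & 0b010                # middle sensor bit (assumption)
        if obstacle_ahead and action == "forward":
            score -= 5                                  # a "mistake": collision
        elif action == "forward":
            score += 1                                  # progress
    return score

def evolve(pop_size=20, generations=30, mutation_rate=0.1):
    population = [random_controller() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop_size // 2]
        children = []
        for parent in survivors:
            child = [a if random.random() > mutation_rate else random.choice(ACTIONS)
                     for a in parent]
            children.append(child)
        population = survivors + children
    return max(population, key=fitness)

if __name__ == "__main__":
    best = evolve()
    print("best controller:", best, "fitness:", fitness(best))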



PROJECT NUMBER 7

FAITHFUL ROBOT FOR THEATRICAL PERFORMANCE.

PEOPLE: RAS society.

EQUIPMENT:
1. PEOPLEBOT mobile robot.
2. Female torso with hands.
3. Female robot head Maria (the one that Martin has).
4. Laptop computer from Dr. Perkowski.
5. You have to create a shelf on the PEOPLEBOT body to attach the laptop.
6. You have to attach the torso on top of the PEOPLEBOT.
7. You have to attach the camera somewhere (think where, but not in the eyes of the robot).
8. You have to attach the head on top of the torso. No hand design. This is mostly a software project.