ALLISON OKAMURA

Stanford University
Public Discussion


  • Vivian Guilfoy

    Facilitator
    Senior Advisor
    May 11, 2015 | 08:47 a.m.

    Haptics is a fascinating topic (and new to me). Its application to many different fields, including education, holds great promise. Can you say more or give examples of what it would be like to use haptics for frog dissection, or to interact with students around the globe using haptics? Who are the students currently using the Hapkit in your course? What physical sensations have they created? How do you plan to help more teachers and students learn about this area of work?

  • Allison Okamura

    Lead Presenter
    Associate Professor
    May 12, 2015 | 02:22 a.m.

    Thanks for your questions!

    Haptics can allow users to experience a virtual world through touch. So when you do a hands-on lab (any lab — from frog dissection to pouring chemicals) you have to ask: is it important to do it hands-on vs. in a computer simulation? Haptics can be used to bridge the physical and virtual worlds to both (1) test hypotheses about whether it is important to be hands on and (2) provide a safe, ethical, and long-term cost-effective way to do virtual experiments that feel real.

    In our first online class (2013), there was an application process and the students ranged from middle schoolers to professionals. That class had 100 people, and we delivered the devices to the students by mail. In our current “self-paced” online haptics class (started Fall 2014), students need to purchase the parts and assemble the devices themselves. Because of the cost and required resources, making the device is optional. So while thousands of people are registered, only a fraction are making the device. We’ll know by the end of the year how many actually made the device, and what difference it made in their experience of the course material.

    Physical sensations created include springs, virtual walls, dampers, and textures. (The device has only one degree of freedom.) We are also developing software to allow haptic display of functions, as well as haptic video games.
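
    [To give a concrete sense of how such sensations are programmed, here is a minimal sketch of a one-degree-of-freedom rendering loop in Arduino-style C++. The helpers readPositionMeters() and commandForce() are hypothetical placeholders for the device's sensor read and motor output, not the actual Hapkit firmware.]

        // Illustrative sketch only (not the actual Hapkit code): rendering a
        // virtual spring, damper, and wall on a one-degree-of-freedom device.
        const float K_SPRING = 150.0f;   // spring stiffness [N/m]
        const float B_DAMPER = 1.2f;     // damping coefficient [N*s/m]
        const float WALL_POS = 0.01f;    // wall located 1 cm from center [m]
        const float K_WALL   = 500.0f;   // wall stiffness [N/m]
        const float DT       = 0.001f;   // ~1 kHz update period [s]

        float lastPos = 0.0f;

        float readPositionMeters() { return 0.0f; }  // device-specific; stubbed here
        void  commandForce(float f) { (void)f; }     // device-specific; stubbed here

        void setup() {
          // Sensor and motor-driver initialization would go here.
        }

        void loop() {
          float pos = readPositionMeters();
          float vel = (pos - lastPos) / DT;  // finite-difference velocity estimate
          lastPos = pos;

          // Spring pulls the handle toward center; damper resists motion.
          float force = -K_SPRING * pos - B_DAMPER * vel;

          // Penalty-based virtual wall: extra restoring force on penetration.
          if (pos > WALL_POS) {
            force -= K_WALL * (pos - WALL_POS);
          }

          commandForce(force);
          delay(1);  // keep the haptic update loop near 1 kHz
        }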

    We currently have an NSF Cyberlearning grant to help us understand how to make the device and its interface better for dissemination. In fact, this week we are doing haptic labs in a local middle school. We are iteratively designing and testing.

    Anyone interested in learning more can go to:
    http://hapticsonline.class.stanford.edu
    http://hapkit.stanford.edu

  • Vivian Guilfoy

    Facilitator
    Senior Advisor
    May 12, 2015 | 08:12 a.m.

    The results of your course and the analyses of the impact of actually making the device should be very interesting, as well as what happens with the middle school labs. I’m wondering whether businesses could be recruited to support the cost of constructing the devices for classrooms, as a way for businesses to demonstrate their interest not only in learning, but in cutting-edge studies for young students and future workers.

  • Debra Bernstein

    Facilitator
    Senior Researcher
    May 11, 2015 | 03:01 p.m.

    I am also new to haptics, although certainly interested to learn more! Can you say more about the types of feedback the Hapkit can provide to students, and the role you think that feedback can play in science learning (for example, since you mentioned the frog dissection)?

  • Allison Okamura

    Lead Presenter
    Associate Professor
    May 12, 2015 | 02:26 a.m.

    The feeling of material properties (okay, frog dissection might sound gross!) from simple springs and dampers to complex viscoelasticity can give students insight about physics, biology, etc. In another example, students could feel the atomic bonds in a “scaled up” force display for chemistry learning. The main idea is that feeling a virtual environment representing the science at hand will improve students’ intuition. Multi-modal interactions (e.g., vision + sound + touch) provide immersion and reinforcement that we hypothesize will improve understanding and retention.
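
    [As a concrete, hypothetical example of the "scaled up" bond idea: the force between two atoms can be computed from the Lennard-Jones potential and magnified into a range the hand can feel. All names and numbers below are illustrative choices, not a description of the project's actual software.]

        // Hypothetical sketch: map handle position to interatomic distance r
        // and render the magnified Lennard-Jones bond force F(r) = -dU/dr,
        // where U(r) = 4*eps*((sigma/r)^12 - (sigma/r)^6).
        #include <math.h>

        float bondForce(float r, float eps, float sigma) {
          float sr6 = powf(sigma / r, 6.0f);
          // Positive = repulsive (atoms too close); negative = attractive.
          return 24.0f * eps * (2.0f * sr6 * sr6 - sr6) / r;
        }

        // Demo mapping: handle displacement [m] -> distance in units of sigma,
        // with the output clamped to a safe few-newton device range.
        float demoForce(float handlePos) {
          float r = 1.0f + 50.0f * handlePos;          // stays near the potential well
          float f = 0.05f * bondForce(r, 1.0f, 1.0f);  // arbitrary display scaling
          return fmaxf(-4.0f, fminf(4.0f, f));         // clamp to +/- 4 N
        }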

  • Debra Bernstein

    Facilitator
    Senior Researcher
    May 12, 2015 | 02:04 p.m.

    Thanks for your response. Since you mentioned multi-modal interactions, are you picturing that the Hapkit will be one of several modes of interaction used during a learning experience (i.e., used in conjunction with sound and visuals)? Also, for frog dissection I can see how haptic feedback would be useful there, to get a sense of the elasticity or thickness of different parts of the body… so, gross but very useful!

  • Allison Okamura

    Lead Presenter
    Associate Professor
    May 12, 2015 | 04:46 p.m.

    And maybe smell. :) Really, it is still a research question how many senses are helpful in a learning experience, and it is certainly going to be specific to the topic being learned. These are questions we are trying to answer through our long-term research.

  • Debra Bernstein

    Facilitator
    Senior Researcher
    May 12, 2015 | 09:15 p.m.

    Those are great questions! It sounds like you’ve targeted a few different fields of study for your research, such as biology and physics learning. Are there others?

  • Allison Okamura

    Lead Presenter
    Associate Professor
    May 12, 2015 | 10:48 p.m.

    We originally used the predecessor to the Hapkit, called the Haptic Paddle, to teach engineering undergraduate and graduate courses in Dynamic Systems as well as Haptics (of course) and Robotics. In our current research, we are focusing on physics and math.

  • Debra Bernstein

    Facilitator
    Senior Researcher
    May 14, 2015 | 07:58 p.m.

    Thanks. This has been a great discussion. Thanks for your responsiveness, and all the best of luck with the project!

  • Nevin Katz

    Facilitator
    Technical Associate
    May 12, 2015 | 08:18 a.m.

    This sounds like a fascinating endeavor. Could you talk more about how a haptics device would work when controlling a surgical robot?

    Also, it’s very interesting that a haptics unit can be compared to an Arduino, which I’ve heard a lot about. How much computer science background would a student need to effectively program a haptics unit?

  • Allison Okamura

    Lead Presenter
    Associate Professor
    May 12, 2015 | 04:50 p.m.

    Yes — most surgical robots today are teleoperated systems. That is, the surgeon manipulates a master “joystick” with many degrees of freedom, and those motions are used to command the behavior of the patient-side robot. Such teleoperation is useful because it allows for motion scaling between the master and patient sides, as well as image alignment, better ergonomics for the surgeon, and more dexterity inside the patient through a small incision (compared to manual minimally invasive surgical instruments). What haptics would do for such a system is allow the surgeon to feel their interaction with the patient’s tissues, via the master of the teleoperation system. This is not typically available clinically today.
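
    [A rough sketch of the scaling idea in code, continuing the same Arduino-style C++ as above. All function names are hypothetical placeholders; real surgical systems add kinematics, filtering, and safety layers far beyond this.]

        // Conceptual sketch of one cycle of position-scaled bilateral
        // teleoperation with force feedback. Illustrative only.
        const float MOTION_SCALE = 0.2f;  // 1 cm at the master -> 2 mm at the tool
        const float FORCE_SCALE  = 2.0f;  // magnify tool-tissue forces for the surgeon

        float readMasterPosition()          { return 0.0f; }  // surgeon's hand (stub)
        void  commandSlavePosition(float p) { (void)p; }      // patient-side robot (stub)
        float readSlaveForce()              { return 0.0f; }  // tool-tissue force (stub)
        void  commandMasterForce(float f)   { (void)f; }      // feedback to surgeon (stub)

        void teleopStep() {
          // Scale down hand motion for fine work through a small incision.
          commandSlavePosition(MOTION_SCALE * readMasterPosition());

          // Reflect the measured tissue interaction back to the surgeon's hand,
          // the haptic feedback that most clinical systems lack today.
          commandMasterForce(FORCE_SCALE * readSlaveForce());
        }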

    The Hapkit Board used to control our device is based on the design of the Arduino Uno; we added sensors and motor amplifiers so that all the electronics are on a single, compact board. Not much CS background is needed at all, but some minimal experience with writing code is necessary. We provide tutorials about this in the online course.
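
    [As an illustration of how little code is involved (hypothetical, not the course's actual tutorial code), a basic virtual spring is essentially one line in the update loop.]

        // Hypothetical illustration: a virtual spring in one line.
        const float K = 100.0f;                  // stiffness [N/m]
        void commandForce(float f) { (void)f; }  // device-specific output (stub)
        void update(float handlePos) {
          commandForce(-K * handlePos);          // Hooke's law: F = -k*x
        }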

  • Nevin Katz

    Facilitator
    Technical Associate
    May 14, 2015 | 01:14 p.m.

    Fascinating! I’ll take a look at the course. Could you talk a bit about how these surgical robots are tested in the medical field? Does it start in the lab and then get tested in hospitals by physicians?

  • Allison Okamura

    Lead Presenter
    Associate Professor
    May 14, 2015 | 01:19 p.m.

    Indeed, they start in research labs and companies and are then tested in the operating room (typically first on tissue, then on cadavers and/or animals, then on human patients after approval).

  • Nevin Katz

    Facilitator
    Technical Associate
    May 14, 2015 | 01:31 p.m.

    That sounds awesome. In the field, what level of biology background does a programmer need to write code for these robots? Is there a lot of interaction between programmers and physicians during development?

  • Allison Okamura

    Lead Presenter
    Associate Professor
    May 14, 2015 | 01:36 p.m.

    Close interaction with clinician collaborators is most important during the design and specification phase, and then during testing phases. The “programming” itself is usually done only by the engineering researchers. People in my lab do not have any biology background — rather we consult with clinician collaborators to learn about specific procedures. This is a little different in other areas of medical robotics, such as rehabilitation, where the engineering researchers need to understand human motor control quite deeply in order to design devices and algorithms.

  • Nevin Katz

    Facilitator
    Technical Associate
    May 14, 2015 | 01:39 p.m.

    Fascinating! Thanks for entertaining all my questions!!

  • Allison Okamura

    Lead Presenter
    Associate Professor
    May 14, 2015 | 11:26 p.m.

    Thanks!

  • Vivian Guilfoy

    Facilitator
    Senior Advisor
    May 12, 2015 | 04:01 p.m.

    Following this discussion makes me think that it might be a great idea to produce a video which shows the process and integrates it with student descriptions of what it is like to use it for a particular exploration.

  • Allison Okamura

    Lead Presenter
    Associate Professor
    May 12, 2015 | 04:52 p.m.

    Our online class includes a number of more tutorial-oriented videos about how to program and use the device; please see http://hapticsonline.class.stanford.edu.

  • Vivian Guilfoy

    Facilitator
    Senior Advisor
    May 13, 2015 | 09:47 a.m.

    Thank you. Will check it out to deepen my own understanding of this fascinating field of study.

  • Vivian Guilfoy

    Facilitator
    Senior Advisor
    May 13, 2015 | 09:57 a.m.

    I just watched your online course introductory video and loved the example of the vibration of our phones to alert us to incoming calls or messages. In your report on this project, are you planning to use student-generated responses to using the Hapkit to help “demystify” this topic for others? Just curious.

  • Allison Okamura

    Lead Presenter
    Associate Professor
    May 14, 2015 | 12:39 a.m.

    Certainly, this is part of our current Cyberlearning studies.

  • Vivian Guilfoy

    Facilitator
    Senior Advisor
    May 14, 2015 | 04:12 p.m.

    Thanks for a rich discussion into this field of study. I learned a lot.

  • Allison Okamura

    Lead Presenter
    Associate Professor
    May 14, 2015 | 11:26 p.m.

    Thanks!
