artikel haptic simulink

Upload: max-cardenas-mantilla

Post on 03-Apr-2018


  • 7/29/2019 Artikel Haptic Simulink


    A Haptic Interface Using MATLAB/Simulink

    Magnus G. Eriksson and Jan Wikander

Mechatronics Lab, Department of Machine Design, The Royal Institute of Technology

    Stockholm, Sweden

    [email protected], [email protected]

Abstract. The concept of a new haptic system based on a low-level interface for

    MATLAB/Simulink is presented. The interface enables users to connect

    dismantled, re-constructed or self-developed haptic devices without any

    commercial drivers or APIs for verification and further development. A virtual

environment is easily created and connected to the Simulink models, which represent haptic algorithms and real-time communication. The paradigm shift to

    start using MATLAB/Simulink with model based programming instead of the

    commonly used C++ programming language for haptic applications will attract

    new users. Users from other scientific areas, e.g. mechatronics, can bring new

    knowledge into the haptic topic and solve control-engineering issues that will

    give the users even more realistic haptic feedback in future applications.

Keywords: Haptic system, real-time programming, MATLAB/Simulink, virtual

    reality, force algorithm, control engineering

    1. Introduction

    The work presented in this paper is done in the context of developing haptic and

visual simulation of surgical procedures [1]. Haptics is the sense of touching
something and getting tactile and kinesthetic force feedback. A haptic device is connected to

    a virtual world for enabling touch of 3D modeled objects. If there is a collision in the

    virtual world between a tool and the object, the device provides force feedback to the

    user.

    The conventional way of creating these haptic systems is to use a commercial haptic

    device including drivers and Application Programming Interfaces (APIs) in

    conjunction with available haptic libraries (HLibs). The devices cannot be used

    without the licensed drivers and APIs, which limits the freedom of developing haptic

    systems and applications.

    The HLibs used today put high demands on the user skills in the C++ programming

    language. The HLibs are built up with C++ and the implementation of self-developed

    haptic applications must be done in this language.

    There are several HLibs presented on the haptic market. The CHAI libraries [2], the

    eTouch API [3] and the H3DAPI [4] are three C++ based open source alternatives.

The main benefit of using an open-source library is access to low-level
details, such as adding an arbitrary haptic device or controlling your own haptic


    force effects. OpenHaptics [5] and the Reachin API [6] are examples of commercial

    available HLibs.

Most haptic devices are constructed with 3-6 Degrees Of Freedom (DOF)

    input sensor signals and 3-6 DOF output actuator signals. Some commonly used

devices are Sensable's PHANTOM series [7], the Novint Falcon [8], the
Delta/Omega devices from Force Dimension [9] and the Freedom6S device from
MPB Technologies [10]. Each of them is delivered with its own commercially
licensed drivers.

In this paper we present a new haptic system setup, where a Sensable PHANTOM
Omni haptic device is dismantled and directly connected at low level to
MATLAB/Simulink [11]. Simulink enables model-based programming instead of
using the C++ programming language. This paradigm shift in haptics will admit new
users and extend the haptic topic into new scientific areas, where the users can be
more at home in mathematics and control-system engineering than in programming.

This will enable further development of the haptic topic, since there are still many
control-engineering issues to be solved to give realistic haptic feedback to the user.

The area of using MATLAB for haptic applications is quite young and unexplored;

    hence, it is difficult to find published information about other systems elsewhere in

    literature. However, Handshake VR Inc. [12] has developed a commercial

    MATLAB/Simulink interface for the Sensable Omni haptic device. In that case the

OpenHaptics API drivers still need to be used. By using pre-defined haptic Simulink
blocks that are directly connected to the OpenHaptics API, they enable a higher level of
programming of the application. However, our system differs in the way that the

low-level connection to the haptic device allows use of self-developed control
algorithms and haptic feedback without any required drivers and APIs. All the

kinematics, transformations, collision detection and force feedback are modeled by the

    user through real time communication with the sensors and actuators of the device.

Thus, implementation and verification of self-modified or self-constructed haptic devices
are possible in this system. Systems based on haptic drivers and APIs do not have this
flexibility and easy compatibility.

This system also gives an extra input for education and a greater understanding of
haptics for beginners. Often when conventional graphics-based haptic development
APIs and environments are used, the subjects have problems separating visualization
and haptics. They create a 3D object and add a haptic surface to it. In this system all
the haptic algorithm information is built up in Simulink separately from the graphics.
The 3D rendering is an extra shell added at the end for nice visualization of the haptic
collision and movements of the haptic device. Creating it in this way increases the
knowledge of the separation between haptic and graphic rendering.

The requirements of a haptic system that are fulfilled in this work are a minimum
update frequency of the haptic loop of 1000 Hz and real-time graphic rendering of
the 3D objects, visualized at 30 Hz [13].

    The paper is organized as follows. Section 2 describes the components used for

    creating the haptic system in MATLAB. Section 3 gives an overview of haptic

modeling and implementation. In section 4, test results and verification of a specific
application are presented. Finally, section 5 gives some conclusions and a glance at

    possible future work.


    2. System Components

In this section the components used for running a haptic application in MATLAB
are presented. Each component is needed for the development of a complete

    system. The main components are: the development platform, the real-time control

    tool, the graphic rendering interface and the haptic device.

    2.1 MATLAB / Simulink / Real Time Workshop

    The mathematical-based programming language MATLAB is used as a base for

    development of the haptic interface. MATLAB is an easy to use program for

    development, analysis and visualization of algorithms and numerical computations.

    Simulink is a block library that runs on MATLAB. It provides a model based

    programming language for simulation and analysis. The programming takes place in a

    graphical environment where the algorithms can be simulated and tested for relevant

data. Simulink is a useful tool for avoiding physical experimental setups and obtaining fast simulated results.

The Real Time Workshop [14] is a plug-in to Simulink, which generates code from
the blocks of a model. The code can then be used for real-time applications in
conjunction with the algorithms created by blocks in Simulink.

    In this research, all the developed algorithms are implemented in Simulink and

    compiled with the Real Time Workshop.

    2.2 dSPACE

dSPACE [15] is a real-time tool for control prototyping and verification of
mechatronic systems. The compiled code from the Real Time Workshop is
downloaded to the dSPACE platform, where it is run in real time on the
dSPACE CPU. The basic function of dSPACE is to read sensor signals from some

    external device, manage the signals with the downloaded algorithms, and send

    relevant signals back to the actuators of the device.

    Here the signals of the encoders and potentiometers from the haptic device are read

    and PWM-signals are sent to the motors.

    2.3 Virtual Reality Toolbox

    The Virtual Reality (VR) Toolbox [16] for MATLAB is used for 3D rendering of

    simulated objects. It can either be implemented as a Simulink block or in low level

    MATLAB programming.

    The virtual environment is built up with the VRML-editor V-Realm Builder [17].

Each geometrical object is defined as a node in the VRML scene graph. Each node
contains fields, which can be reached from MATLAB. For example, a sphere is a node that
contains the field translation. To this field a signal can be sent from
MATLAB/Simulink to perform graphic rendering of real-time translation of the
sphere, which is how it is done in this project. The VR Toolbox is usually used to
demonstrate simulated signals from Simulink as 3D objects. But in our case, real-time
signals are used instead of simulated ones. Therefore dSPACE MLIB functions are
used to take real-time signals from the dSPACE platform to the MATLAB workspace. A


    script in the MATLAB workspace transfers the signals with the VR Toolbox to the

    virtual environment. The Blaxxun VRML-viewer [18] enables 3D visualization of the

    rendered objects. The user can explore the virtual environment with several functions

of the VRML viewer, such as zoom, rotation and translation.

    2.4 Sensable PHANTOM Omni Haptic Device

The Sensable PHANTOM Omni haptic device is used to provide haptic
feedback to the user when manipulating virtual objects. Normally the OpenHaptics
drivers are needed for the device to work with haptic applications. The drivers are not used
in this project, just the hardware. The haptic device is dismantled and the sensors and

actuators are connected at low level to dSPACE. There are six input signals (3 encoders

    + 3 potentiometers) and three output signals (3 dc motors). Sensor and actuator

    signals are connected to the dSPACE system that also controls timing in the system.

The device can be seen as an inverted robot arm, where the algorithm reads the
position and calculates a force feedback to the user if a collision in the virtual
environment occurs. The mechanical construction of the arm consists of a three-link
robot arm, three dc motors that drive tensioned wires between the joints, and six sensors
used to calculate the position and orientation of the end effector (x, y, z, yaw, pitch and roll).

There are also two buttons on the device and a pre-defined position location that
enables calibration. Figure 1 shows the dismantled PHANTOM Omni haptic
device.

    Figure 1. The dismantled PHANTOM Omni haptic device.

    2.5 Drivers

Since the haptic device is dismantled, new drivers are created for easy
implementation. Motor drivers are used to convert the PWM-signals from dSPACE
at collision into relevant motor voltages. To enable the whole working range of the
motors an 18 V supply is used, which gives a feeling of high stiffness in the haptic
feedback.

Other drivers are also created for indication of the buttons and the calibration position
location. These drivers are switches that give 5 V/0 V when a button is pushed/not
pushed. The high or low values give 1/0 to the slave bit in channels on the dSPACE
platform.


    3. System Overview

The system overview is presented in figure 2. The haptic interface using
MATLAB/Simulink is briefly described as follows. A PHANTOM Omni haptic
device is dismantled and the sensors and actuators are connected at low level to
dSPACE. The signals are transferred from dSPACE to MATLAB/Simulink (running
on a PC) by MLIB functions. A virtual reality scene is built up in the MATLAB VR
Toolbox. The procedure of the algorithm is as follows (inertia is assumed to be
negligible):

Read the encoders and use direct kinematics to get the position of the end effector.
Check collision detection.
If no collision: no signals to the motors.
If collision: calculate a force and transform it to motor torques. Send PWM-signals
from dSPACE to the motors based on the torques. A PI-controller is used to control
the motor current.
Graphic rendering of the tool object and a virtual scene for interaction.

    Figure 2. System overview of the MATLAB/Simulink haptic interface.

The following subsections describe the system's functionality in detail.



    3.1 Kinematic Model

A direct kinematic model has been developed for the PHANTOM Omni haptic
device. The six sensor signals (3 encoders + 3 potentiometers) can be reached from
the dSPACE CPU and the position and orientation of the end effector (x, y, z, yaw,
pitch and roll) can be calculated. But for the collision detection and haptic algorithm
the position of the end effector is the only parameter of interest. Therefore only the
three encoder signals are used in the kinematic model to calculate the correct position.
For the graphic rendering all six sensors can be useful, to enable depiction of the
end effector's rotation.

The well-known Denavit-Hartenberg [19] convention is adopted to define each link
and frame of the open-chain manipulator. The sensor signals and lengths of the links
are the necessary joint and rigid-body variables for an unambiguous solution. See

    figure 3 for a sketch of the links, joints and variables for the PHANTOM Omni.

    Figure 3. A sketch of the links, joints and variables for the PHANTOM Omni.

All the joints are revolute; hence θ1, θ2 and θ3 will be the controlling variables in the
direct kinematic model. The base frame is O0 and the end effector frame is Oe. The
established Denavit-Hartenberg link parameters are specified in table 1.

Link (Li)   Frame (Oi)   a_i   α_i    d_i   θ_i
L0          O0           -     -      -     -
L1          O1           0     π/2    0     θ1
L2          O2           a2    0      0     θ2
L3          O3           a3    0      0     θ3

Table 1. The established Denavit-Hartenberg link parameters for the PHANTOM Omni.



On the basis of the parameters in table 1, the homogeneous transformation matrix to go
from one frame to another can be described as follows. A_i^{i-1}(\theta_i) is the matrix, \theta_i is the
variable and i = 1..n is the frame index.

A_i^{i-1}(\theta_i) =
\begin{bmatrix}
\cos\theta_i & -\sin\theta_i\cos\alpha_i & \sin\theta_i\sin\alpha_i & a_i\cos\theta_i \\
\sin\theta_i & \cos\theta_i\cos\alpha_i & -\cos\theta_i\sin\alpha_i & a_i\sin\theta_i \\
0 & \sin\alpha_i & \cos\alpha_i & d_i \\
0 & 0 & 0 & 1
\end{bmatrix}    Eq. 1

For the PHANTOM Omni haptic device n = 3, which gives the following transformation
matrices between the frames.

A_1^0(\theta_1) =
\begin{bmatrix}
\cos\theta_1 & 0 & \sin\theta_1 & 0 \\
\sin\theta_1 & 0 & -\cos\theta_1 & 0 \\
0 & 1 & 0 & 0 \\
0 & 0 & 0 & 1
\end{bmatrix}    Eq. 2

A_2^1(\theta_2) =
\begin{bmatrix}
\cos\theta_2 & -\sin\theta_2 & 0 & a_2\cos\theta_2 \\
\sin\theta_2 & \cos\theta_2 & 0 & a_2\sin\theta_2 \\
0 & 0 & 1 & 0 \\
0 & 0 & 0 & 1
\end{bmatrix}    Eq. 3

A_3^2(\theta_3) =
\begin{bmatrix}
\cos\theta_3 & -\sin\theta_3 & 0 & a_3\cos\theta_3 \\
\sin\theta_3 & \cos\theta_3 & 0 & a_3\sin\theta_3 \\
0 & 0 & 1 & 0 \\
0 & 0 & 0 & 1
\end{bmatrix}    Eq. 4

Based on these matrices the homogeneous transformation matrix T_e^0 is computed,
which yields the position and orientation of the end effector with respect to the base
frame 0 (frame 3 is equal to the frame of the end effector).

T_e^0 = A_1^0(\theta_1) A_2^1(\theta_2) A_3^2(\theta_3) =
\begin{bmatrix}
c_1 c_{23} & -c_1 s_{23} & s_1 & c_1 (a_2 c_2 + a_3 c_{23}) \\
s_1 c_{23} & -s_1 s_{23} & -c_1 & s_1 (a_2 c_2 + a_3 c_{23}) \\
s_{23} & c_{23} & 0 & a_2 s_2 + a_3 s_{23} \\
0 & 0 & 0 & 1
\end{bmatrix}    Eq. 5

Here c_1 denotes cos(\theta_1), s_2 denotes sin(\theta_2), c_{23} denotes cos(\theta_2+\theta_3) and so on. In the derived
transformation matrix the first three rows of the last column give the global position
of the end effector.


pos_{end\_effector} =
\begin{bmatrix} x \\ y \\ z \end{bmatrix} =
\begin{bmatrix}
\cos\theta_1 \,(a_2\cos\theta_2 + a_3\cos(\theta_2+\theta_3)) \\
\sin\theta_1 \,(a_2\cos\theta_2 + a_3\cos(\theta_2+\theta_3)) \\
a_2\sin\theta_2 + a_3\sin(\theta_2+\theta_3)
\end{bmatrix}    Eq. 6

The expression of the end effector's position is built up in Simulink by connecting blocks and signals to create an algorithm of the kinematics.
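As a plain-code illustration of Eq. 6 (outside Simulink), the direct kinematics can be sketched as a short function. The link lengths a2 and a3 follow the Denavit-Hartenberg table above; their numerical values are not given in the paper, so they are left as parameters here.

```python
import math

def end_effector_position(theta1, theta2, theta3, a2, a3):
    """Direct kinematics of Eq. 6: global position of the end effector,
    given the three encoder joint angles (rad) and the link lengths
    a2, a3 from the Denavit-Hartenberg table."""
    c1, s1 = math.cos(theta1), math.sin(theta1)
    c2, s2 = math.cos(theta2), math.sin(theta2)
    c23, s23 = math.cos(theta2 + theta3), math.sin(theta2 + theta3)
    r = a2 * c2 + a3 * c23          # radial reach in the vertical arm plane
    x = c1 * r
    y = s1 * r
    z = a2 * s2 + a3 * s23
    return (x, y, z)
```

With both links stretched out horizontally (all angles zero), the reach is simply a2 + a3 along the x-axis, which gives a quick sanity check of the expression.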

    3.2 Collision Detection Algorithm

The pre-defined workspace of the virtual environment gives virtual boundaries that
limit the free-space movements. In our case, the workspace is created as the bounding
box shown in figure 4.

The position of the end effector is calculated by the kinematic algorithm and the
locations of the virtual walls are pre-defined. By checking the xyz-coordinates of the
end effector in relation to the boundaries, collision detection is performed in real time
on the dSPACE CPU. The collision detection algorithm is built up with blocks in
Simulink as presented in figure 5.
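The bounding-box test that the Simulink blocks of figure 5 implement can be sketched as follows. The tuple-based box representation and the returned penetration/normal pair are illustrative choices, not the paper's exact signal layout.

```python
def check_collision(pos, box):
    """Axis-aligned bounding-box collision test, mirroring the logic of
    the Simulink collision-detection blocks.  pos = (x, y, z) of the end
    effector; box = (xmin, xmax, ymin, ymax, zmin, zmax) wall locations.
    Returns (penetration depth, inward surface normal) for the deepest
    violated wall, or None while the probe is in free space."""
    x, y, z = pos
    xmin, xmax, ymin, ymax, zmin, zmax = box
    hits = []
    if x < xmin: hits.append((xmin - x, (1, 0, 0)))
    if x > xmax: hits.append((x - xmax, (-1, 0, 0)))
    if y < ymin: hits.append((ymin - y, (0, 1, 0)))
    if y > ymax: hits.append((y - ymax, (0, -1, 0)))
    if z < zmin: hits.append((zmin - z, (0, 0, 1)))
    if z > zmax: hits.append((z - zmax, (0, 0, -1)))
    if not hits:
        return None
    return max(hits)   # the deepest penetration dominates the feedback
```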


    Figure 4. Workspace in the virtual environment is pre-defined as a bounding box.


    Figure 5. The collision detection algorithm for xyz-coordinates created in Simulink.

    3.3 Haptic Feedback

If the collision detection algorithm finds a collision between the sphere, which
illustrates the depicted position and movements of the end effector, and the virtual
walls, a haptic feedback will be sent to the user. The force feedback gives the user a
sense of touching the virtual walls, and the sphere can be dragged along the
boundaries.

The basic idea of the haptic algorithm is based on the well-known proxy-probe
method. The probe position is equal to the global position of the end effector,
independent of whether there is a collision or not. The proxy position is the position
on the boundary where the collision occurs. The force algorithm is a modified
spring-damper model

\bar{F} = (k d + c \dot{d}) \bar{e},

where the spring constant, k, and the damper constant, c, are arbitrarily chosen, d is
the distance between the proxy and the probe, and \bar{e} is the normalized gradient of
the collided surface. See figure 6.

    Figure 6. The haptic algorithm when collision occurs in the virtual environment.
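A minimal sketch of the force law follows, with the damping term approximated by a backward difference over one haptic-loop period. The constants k and c are placeholders (the paper only states that they are arbitrarily chosen).

```python
def haptic_force(d, d_prev, dt, e_hat, k=800.0, c=2.0):
    """Modified spring-damper force of section 3.3: F = (k*d + c*d_dot)*e_hat.
    d is the probe-proxy penetration depth, e_hat the normalized surface
    gradient.  k and c are illustrative constants, not values from the
    paper.  d_dot is approximated by a backward difference over one
    haptic-loop period dt."""
    d_dot = (d - d_prev) / dt
    mag = k * d + c * d_dot
    mag = max(mag, 0.0)            # never pull the user into the surface
    return tuple(mag * comp for comp in e_hat)
```

Clamping the magnitude at zero is a common practical safeguard so that a fast retraction (large negative d_dot) cannot produce a sticking force.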



The motivation for using the spring-damper model as a relevant haptic algorithm is as
follows. Assume that the probe and the proxy are two dynamical masses that move in
relation to each other, connected by a spring and a damper. The derivation and
motivation are based on the approximations presented in figure 7.

    Figure 7. Approximations used for derivation of using the spring-damper model as a haptic algorithm.

Based on the approximations from figure 7, the following derivation shows that the
spring-damper model gives a relevant haptic feedback for the probe-proxy case.

Newton's second law gives:

\sum F_x = m \ddot{x}    Eq. 7

F_{external} - k x - c \dot{x} - m_{probe} g = m_{probe} \ddot{x}    Eq. 8

The probe is assumed to be weightless by the motivation above, so m_{probe} = 0:

k x + c \dot{x} - F_{external} = 0    Eq. 9

F_{external} = k x + c \dot{x}    Q.E.D.

[Figure 7 consists of sketches with the following annotations: the proxy is approximated by a wall that does not move; the probe can be assumed to be weightless, since if there is no external force (no collision) there is equilibrium and probe = proxy at the wall or in free workspace; if a collision is detected, an external force is applied based on the spring-damper model. A free-body diagram in equilibrium of the probe object is created, where x is the distance the probe has been moved from the proxy (wall) by the external force, and the acting forces are kx, c\dot{x}, m_{probe}g and F_{external}.]


    3.4 Motor Torques

The force that occurs if a collision is detected must be transformed to corresponding
motor torques for the three motors. Inertia, and hence dynamic impacts, is assumed to
be negligible; therefore the motor torques will not depend on velocity and
acceleration. The mechanical construction of the PHANTOM Omni is such that two of
the motors rotate around the global x-axis and one motor rotates around the global
z-axis. See figure 8.

    Figure 8. Relevant parameters for the torque algorithm.

The torque algorithm is described in relation to the information given in figure 8. The
three known positions from the origin, \bar{p}_1, \bar{p}_2 and \bar{p}_{end}, give the corresponding vector
directions of the links \bar{L}_1, \bar{L}_2 and \bar{L}_3. The three-dimensional force vector is
determined by the haptic algorithm and located at the position of the end effector.
The algorithm gives the torques \bar{T}_1, \bar{T}_2 and \bar{T}_3 in three dimensions for each motor, but
T_{1z}, T_{2x} and T_{3x} are the only components used.

\bar{L}_1 = \bar{p}_1 - \bar{0}    Eq. 10

\bar{L}_2 = \bar{p}_2 - \bar{p}_1    Eq. 11

\bar{L}_3 = \bar{p}_{end} - \bar{p}_2    Eq. 12

\bar{T}_1 = (\bar{L}_1 + \bar{L}_2 + \bar{L}_3) \times \bar{F} = (\ldots)\bar{i} + (\ldots)\bar{j} + T_{1z}\bar{k}    Eq. 13

\bar{T}_2 = (\bar{L}_2 + \bar{L}_3) \times \bar{F} = T_{2x}\bar{i} + (\ldots)\bar{j} + (\ldots)\bar{k}    Eq. 14

\bar{T}_3 = \bar{L}_3 \times \bar{F} = T_{3x}\bar{i} + (\ldots)\bar{j} + (\ldots)\bar{k}    Eq. 15
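Eqs. 10-15 amount to three cross products with progressively shorter lever arms. A sketch in plain code, with joint positions and force as 3-tuples (an assumed layout, not the Simulink signal format):

```python
def cross(u, v):
    """3-D cross product u x v."""
    return (u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0])

def motor_torques(p1, p2, p_end, F):
    """Torque algorithm of Eqs. 10-15: link vectors are formed from the
    joint positions, each torque is the cross product of the lever arm
    (sum of the distal links) with the collision force F, and only the
    components T1z, T2x, T3x actuate the three motors."""
    L1 = p1                                        # Eq. 10: L1 = p1 - 0
    L2 = tuple(b - a for a, b in zip(p1, p2))      # Eq. 11
    L3 = tuple(b - a for a, b in zip(p2, p_end))   # Eq. 12
    arm1 = tuple(a + b + c for a, b, c in zip(L1, L2, L3))  # = p_end
    arm2 = tuple(b + c for b, c in zip(L2, L3))
    T1 = cross(arm1, F)   # Eq. 13
    T2 = cross(arm2, F)   # Eq. 14
    T3 = cross(L3, F)     # Eq. 15
    return T1[2], T2[0], T3[0]    # (T1z, T2x, T3x)
```

Note that L1 + L2 + L3 telescopes to p_end, so the base motor sees the full lever from the origin to the end effector.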



    3.5 PWM-signals

A PWM-signal must be sent from the dSPACE CPU to control a motor. Hence, the
above-mentioned motor torques are converted to currents based on gear ratio and motor
type. The currents (one for each motor) are normalized to relevant PWM-signals (0-
1). The PWM-signals are sent to the motors that tension the wires of the haptic
device and give the user a feeling of force feedback when a collision is detected in the
virtual environment.
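The torque-to-PWM conversion can be sketched as follows. The gear ratio, torque constant and maximum current are hypothetical device parameters; the paper does not list them, so the values in the test are purely illustrative.

```python
def torque_to_pwm(torque, gear_ratio, torque_constant, i_max):
    """Section 3.5 as a sketch: convert a motor torque [Nm] to a
    normalized PWM duty cycle in [0, 1].  The motor current follows
    from the torque constant and gear ratio (i = T / (n * k_t)), and is
    then normalized by the maximum motor current i_max.  All parameter
    values are device-specific assumptions, not given in the paper."""
    current = torque / (gear_ratio * torque_constant)
    duty = abs(current) / i_max
    return min(duty, 1.0)         # saturate at full duty cycle
```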

    3.6 PI-control of the Current

An error is established by comparing the measured actual motor currents with the
calculated motor currents. A PI-controller is implemented to reduce the errors of the
algorithm that calculates the motor currents. The integral part of the controller
removes the static error by summing the current difference every time a PWM-signal
is sent to the motors; hence, after a while the errors are reduced. The P-factor
is tested and verified for a specific constant value. The control signal of the current is
sent back to the motor. See figure 9.

    Figure 9. Simulink model of the PI-controller for one motor.
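A discrete-time version of the controller in figure 9 might look like the following. The gains and sample time are illustrative assumptions, not the verified values from the paper.

```python
class CurrentPI:
    """Discrete PI current controller in the spirit of figure 9: the
    error between the calculated (reference) and the measured motor
    current is accumulated every loop step, and the control signal
    corrects the motor command.  Gains kp, ki and sample time dt are
    placeholders, not the paper's tuned values."""
    def __init__(self, kp=0.5, ki=50.0, dt=0.001):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0

    def update(self, i_ref, i_measured):
        error = i_ref - i_measured
        self.integral += error * self.dt   # I-part removes the static error
        return self.kp * error + self.ki * self.integral
```

The measured current fed into `update` is the one recovered from the op-amp circuit of this section, i.e. the voltage drop over the series resistor divided by the resistor value.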

The real current in the motor is measured by using an operational amplifier (op-amp).
A resistor is mounted in series with the motor, and the op-amp measures the voltage
drop over the resistor when the motor is activated. The voltage drop is taken into the
dSPACE platform. The motor current is obtained by dividing the voltage drop by the
resistor value.

    3.7 Graphic Rendering

To make the haptic feedback more understandable, a 3D virtual environment is built up
and visualized. The collision detection is based on max/min boundaries along the
xyz-axes, which are easily rendered as a virtual cube having the walls at the given
boundary values. A small sphere is graphically rendered to illustrate the movements of
the end effector of the haptic device. The sphere follows the movements of the end
effector in real time at 30 Hz, which enables visualization of the collision with the
virtual walls. There are no deformations of the collided walls; hence just the


translation of the sphere needs to be updated in the graphics loop, since the other
objects are static and unchanged. The virtual environment presented in figure 10 is

    rendered using the MATLAB VR Toolbox.

    Figure 10. The virtual environment including the walls and the sphere.

The work presented in sub-sections 3.1-3.6 is implemented and built up in Simulink,
and by using the Real Time Workshop it is compiled and downloaded to the dSPACE
processor. All the collision detection and haptic feedback is performed on the
dSPACE platform at 20 kHz in real time. To extend the solution with 3D visualization,
a virtual environment is created and connected to the system through the MATLAB

VR Toolbox, which is running on the PC. The position of the end effector is
transferred in real time with MLIB functions from the dSPACE CPU to the PC for
visualization. The translation of the small sphere in the virtual environment directly
follows the position of the end effector in real time.
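The two update rates (the fast haptic loop versus the 30 Hz graphics update) suggest a simple decimation structure. The sketch below uses hypothetical callbacks standing in for the dSPACE side and the VR Toolbox side; the rate values are parameters.

```python
def run_loops(steps, haptic_step, graphics_step, haptic_hz=1000, graphics_hz=30):
    """Sketch of the dual-rate structure of section 3.7: the haptic
    callback runs every tick, while the graphics callback (the sphere
    translation update) is decimated to roughly graphics_hz.  Both
    callbacks are hypothetical placeholders, not the paper's code."""
    ratio = max(1, round(haptic_hz / graphics_hz))
    for n in range(steps):
        haptic_step(n)            # force computation at the haptic rate
        if n % ratio == 0:
            graphics_step(n)      # sphere translation at the graphics rate
    return ratio
```

Decoupling the two rates is what lets the stiff 1 kHz force loop stay deterministic on dSPACE while the PC renders at a much slower pace.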

    4. Test Results and Verification

In this paper some early test results and verification of the haptic algorithm are
presented. The test is based on the application described above, with the small sphere

    that collides with the walls in the virtual environment.

    4.1 Product Specification

    A Pentium D 2.8 GHz with 2.0 GB RAM desktop PC was used for this application.

    The graphic card is an Intel 82945G Express. A dismantled Sensable PHANTOM

Omni is used as the haptic device. No drivers or API components are required, since
there is a direct connection to the low-level sensors and dc motors. The dSPACE CPU
platform is used for real-time control and routing of signals. MATLAB 7.2 and Simulink 6.4

    were used with the Real Time Workshop. The MATLAB Virtual Reality Toolbox 4.0

    was used for graphic rendering.


    4.2 The Application

A 3D-rendered small sphere follows the movements of the end effector, and
collision detection is performed against pre-defined virtual walls. The virtual
environment is created in the V-Realm Builder VRML editor. A MATLAB m-file
containing all variables must be updated before compiling the Simulink model and
downloading it to the dSPACE platform for every new application.

    4.3 Test Procedure and Results

    The developed haptic platform described above has been tested and verified for the

    basic application of the sphere colliding with virtual walls along the global xyz-axes.

A user is holding the haptic device and manipulating the virtual objects. A haptic
feedback is sent to the device when the operator moves the sphere so that it collides with
the walls, which gives the user a sense of kinesthetic and tactile feedback. Test data has

    been logged from a specific test, where the user is dragging the sphere against one

    wall and relevant data are saved. The logged data are the position of the probe relative

    to the wall, the calculated force, the torques and the PWM-signals for the three

motors. See figures 11-14. The globally defined boundary conditions of the walls and

    the pre-defined spring and damper constants of the force algorithm were also used for

    the analysis of the system. The test procedure took place for one certain case of pre-

    defined parameters and logged data. The stiffness constant in the haptic algorithm was

    set to a high value to give the user a sense of collision between two rigid materials. As

    mentioned above the virtual scene was 3D rendered for visual feedback.

Figure 11. Position of the sphere relative to the wall.

Figure 12. The magnitude of the calculated force.

Figure 13. The modeled torques for the three motors.

Figure 14. The PWM-signals for the three motors.


The result in figure 11 indicates that the position of the sphere follows the position
of the wall very well, but follows the user movements into the material when a higher
force is applied. The surface is modeled to be quite stiff; therefore the probe does not
go deeper into the material, which would have been the case for a surface modeled
with a lower spring constant. From figure 12 it can be seen that the magnitude of the
calculated force directly follows the penetration distance into the surface; this is as
expected. The force is transformed to corresponding motor torques for the three
motors. Figure 13 depicts that the modeled torques (T1z, T2x and T3x) vary as the
user is dragging and pushing the sphere along the surface of the wall. T1z is zero
because there is no collision in the y-direction. Any other conclusions are hard to
draw. The PWM-signal sent from

    dSPACE to each motor directly follows the torque, as illustrated in figure 14. It is

    hereby verified that the force algorithm and the MATLAB/Simulink haptic system

work properly for this application.

There have also been blind tests with subjects who had never tried haptics before, and
they reported that you really get a realistic perception of touching the virtual objects.

    5. Conclusion and Future Work

    In this work a haptic interface for MATLAB/Simulink has been developed. A

PHANTOM Omni haptic device is dismantled and the sensors and actuators are
connected at low level to a dSPACE platform for real-time communication. The haptic

    algorithm including kinematics, collision detection, force calculation, transformations

to motor torques, and implementation of a PI controller, is modelled in Simulink. The
developed algorithms built up of Simulink blocks are compiled with the Real Time
Workshop. A virtual reality scene is built up in the MATLAB VR Toolbox for
real-time 3D visualization.

There are some major benefits of using this system:

MATLAB/Simulink enables model-based programming instead of using the
C++ programming language.
The low-level connection to the haptic device allows use of self-developed
haptic algorithms without any required drivers and APIs.
Easy implementation and verification of self-modified or constructed haptic
devices is possible in this system.
In this system all the haptic algorithm information is built up in Simulink
separately from the graphics. Creating it in this way increases the knowledge
of the separation between haptic and graphic rendering.

The developed haptic platform has been tested and verified for the basic application
of a 3D-rendered small sphere that follows the movements of the end effector, with
collision detection performed against pre-defined virtual walls. The good results from
the tests verified that the force algorithm and the MATLAB/Simulink haptic system
work properly for this application.

Possible future work could include some of the following suggestions:

Creating other applications will require more advanced collision detection
algorithms for arbitrarily modeled 3D objects.


U_drop       Voltage drop [V]
x            Spring length [m]
\dot{x}      Change of spring length [m/s]
\ddot{x}     Acceleration in x-direction [m/s^2]
x, y, z      Global coordinate system
x_i, y_i, z_i  Local coordinate system at O_i

References

[1] Eriksson M.G., Haptic and Visual Simulation of a Material Cutting Process, Licentiate Thesis, KTH, Stockholm, Sweden; 2006.
[2] CHAI 3D, http://www.chai3d.org.
[3] eTouch API, http://www.ethouch3d.org.
[4] H3DAPI, http://www.h3dapi.org.
[5] OpenHaptics, http://www.sensable.com/products-openhaptics-toolkit.htm.
[6] Reachin API, http://www.reachin.se.
[7] Sensable Inc. PHANTOM, http://www.sensable.com/products-haptic-devices.htm.
[8] Novint Falcon, http://home.novint.com/products/novint_falcon.php.
[9] Force Dimension, http://www.forcedimension.com/fd/avs/home.
[10] MPB Technologies, http://www.mpb-technologies.ca/mpbt/haptics.
[11] MATLAB/Simulink, http://www.mathworks.com/.
[12] Handshake VR Inc., http://www.handshakevr.com/.
[13] Mark W.R., Randolph S.C., Finch M., Verth J., Adding force feedback to graphic systems: issues and solutions, 23rd Conference on Computer Graphics; 1996, pp. 447-52.
[14] Real Time Workshop, http://www.mathworks.com/products/rtw/.
[15] dSPACE, http://www.dspaceinc.com/ww/en/inc/home.cfm.
[16] The Virtual Reality Toolbox, http://www.mathworks.com/products/virtualreality/.
[17] V-Realm Builder, http://www.ligos.com/.
[18] Blaxxun VRML viewer, http://www.blaxxun.com/.
[19] Sciavicco L., Siciliano B., Modeling and Control of Robot Manipulators, 2nd edition, Springer; 1999, pp. 39-77.
[20] Flemmer H., Control Design and Performance Analysis of Force Reflective Teleoperators: a Passivity Based Approach, Doctoral Thesis, KTH, Stockholm, Sweden; 2004.