Touchable 3D video system

Jongeun Cha, Mohamad Eid, Abdulmotaleb El Saddik

    Research output: Contribution to journal › Article

    Abstract

    Multimedia technologies are reaching the limits of providing audio-visual media that viewers consume passively. An important factor that will ultimately enhance the user's experience in terms of impressiveness and immersion is interaction. Among everyday interactions, haptic interaction plays a prominent role in enhancing users' quality of experience and in promoting physical and emotional development. A critical next step in multimedia research is therefore to bring the sense of touch, or haptics, into multimedia systems and applications. This article proposes a touchable 3D video system in which viewers can actively touch a video scene through a force-feedback device, and presents the underlying technologies in three functional components: (1) content generation, (2) content transmission, and (3) viewing and interaction. First, we introduce a depth image-based haptic representation (DIBHR) method that adds haptic and heightmap images to the traditional depth image-based representation (DIBR) in order to encode the haptic surface properties of the video media. In this representation, the haptic image encodes stiffness, static friction, and dynamic friction, whereas the heightmap image encodes the roughness of the video content. Based on this representation, we discuss how to generate synthetic and natural (real) video media with a 3D modeling tool and a depth camera, respectively. Next, we introduce a transmission mechanism based on the MPEG-4 framework, in which new MPEG-4 BIFS nodes are designed to describe the haptic scene. Finally, we describe a haptic rendering algorithm that computes the interaction force between the scene and the viewer, and evaluate its performance in terms of computational time and smoothness of the contact force. The algorithm operates marginally within the 1 kHz update rate required for stable interaction forces, and by median-filtering depth images that contain high-frequency geometric noise it produces smoother contact forces.
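
    The abstract describes a concrete per-pixel representation (DIBHR: a haptic image carrying stiffness, static friction, and dynamic friction, plus a heightmap carrying roughness) and a haptic rendering loop that must run near 1 kHz and median-filters noisy depth images. The sketch below is a hedged illustration of how such a frame could be organized and queried; the class and function names, the channel layout, and the penalty-style force formulation are assumptions made for illustration, not the authors' implementation.

        # Minimal sketch of a DIBHR frame and a per-pixel contact-force query.
        # All names here (DIBHRFrame, compute_contact_force) are hypothetical.
        import numpy as np
        from scipy.ndimage import median_filter

        class DIBHRFrame:
            """One touchable frame: DIBR (color + depth) extended with haptic and heightmap images."""

            def __init__(self, color, depth, haptic, heightmap):
                self.color = color                          # H x W x 3 video frame
                # Median-filter the depth map to suppress the high-frequency
                # geometric noise of depth cameras, which otherwise yields
                # jittery contact forces during rendering.
                self.depth = median_filter(depth, size=3)   # H x W depth image
                self.stiffness = haptic[..., 0]             # haptic image, channel 0
                self.static_friction = haptic[..., 1]       # haptic image, channel 1
                self.dynamic_friction = haptic[..., 2]      # haptic image, channel 2
                self.roughness = heightmap                  # H x W heightmap (fine surface detail)

        def compute_contact_force(frame, px, py, penetration, normal):
            """Penalty-style normal force at pixel (px, py); a stand-in for the
            paper's haptic rendering algorithm, which has to complete within
            roughly 1 ms so the ~1 kHz force-feedback loop stays stable."""
            if penetration <= 0.0:
                return np.zeros(3)                          # no contact
            k = frame.stiffness[py, px]                     # per-pixel stiffness
            force = k * penetration * np.asarray(normal)
            # Static/dynamic friction and heightmap roughness would modulate
            # the tangential component here; omitted to keep the sketch short.
            return force

    In a running system, a query like this would sit inside the roughly 1 kHz servo loop that drives the force-feedback device, while the video and depth streams update at the usual frame rate.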

    Original language: English (US)
    Article number: 29
    Journal: ACM Transactions on Multimedia Computing, Communications and Applications
    Volume: 5
    Issue number: 4
    DOIs: 10.1145/1596990.1596993
    State: Published - Oct 1 2009

    Fingerprint

    Friction
    Multimedia systems
    Median filters
    Surface properties
    Surface roughness
    Cameras
    Stiffness
    Feedback

    Keywords

    • Haptic rendering algorithm
    • Haptic surface properties
    • Video representation

    ASJC Scopus subject areas

    • Hardware and Architecture
    • Computer Networks and Communications

    Cite this

    Cha, Jongeun; Eid, Mohamad; El Saddik, Abdulmotaleb. Touchable 3D video system. In: ACM Transactions on Multimedia Computing, Communications and Applications, Vol. 5, No. 4, Article 29, 01.10.2009.

    @article{f09afeab894f4bf68c82f0f14d6e9340,
    title = "Touchable 3D video system",
    keywords = "Haptic rendering algorithm, Haptic surface properties, Video representation",
    author = "Jongeun Cha and Mohamad Eid and Abdulmotaleb {El Saddik}",
    year = "2009",
    month = "10",
    day = "1",
    doi = "10.1145/1596990.1596993",
    language = "English (US)",
    volume = "5",
    journal = "ACM Transactions on Multimedia Computing, Communications and Applications",
    issn = "1551-6857",
    publisher = "Association for Computing Machinery (ACM)",
    number = "4",

    }
