Group recommendations via multi-armed bandits

José Bento, Stratis Ioannidis, Shanmugavelayutham Muthukrishnan, Jinyun Yan

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

    Abstract

    We study recommendations for persistent groups that repeatedly engage in a joint activity. We approach this as a multi-arm bandit problem. We design a recommendation policy and show it has logarithmic regret. Our analysis also shows that regret depends linearly on d, the size of the underlying persistent group. We evaluate our policy on movie recommendations over the MovieLens and MoviePilot datasets. Copyright is held by the author/owner(s).
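The abstract describes the problem as a multi-armed bandit with logarithmic regret; the paper's own group-recommendation policy is not reproduced here. As a generic illustration of a logarithmic-regret bandit policy in this setting, a standard UCB1 sketch (Auer et al.'s algorithm, not the authors' method; arm indices and the Bernoulli reward model below are hypothetical) might look like:

```python
import math
import random

def ucb1(n_arms, pulls, reward_fn):
    """Standard UCB1 policy: play each arm once, then pick the arm
    maximizing empirical mean + exploration bonus. Generic sketch only;
    not the group-recommendation policy from the paper."""
    counts = [0] * n_arms   # times each arm (e.g., a movie) was recommended
    sums = [0.0] * n_arms   # cumulative reward observed per arm
    for t in range(1, pulls + 1):
        if t <= n_arms:
            arm = t - 1     # initialization round: try every arm once
        else:
            arm = max(range(n_arms),
                      key=lambda a: sums[a] / counts[a]
                      + math.sqrt(2 * math.log(t) / counts[a]))
        counts[arm] += 1
        sums[arm] += reward_fn(arm)
    return counts, sums

# Hypothetical Bernoulli rewards: arm 2 has the highest mean, so UCB1
# should concentrate its pulls there as t grows.
random.seed(0)
means = [0.2, 0.5, 0.8]
counts, sums = ucb1(3, 2000, lambda a: 1.0 if random.random() < means[a] else 0.0)
```

In this sketch each "arm" would stand for a candidate item; the paper's contribution is a policy whose regret is logarithmic in time and linear in the group size d, which a single-user UCB1 does not capture.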

    Original language: English (US)
    Title of host publication: WWW'12 - Proceedings of the 21st Annual Conference on World Wide Web Companion
    Pages: 463-464
    Number of pages: 2
    DOI: 10.1145/2187980.2188078
    State: Published - May 21 2012
    Event: 21st Annual Conference on World Wide Web, WWW'12 - Lyon, France
    Duration: Apr 16 2012 - Apr 20 2012

    Publication series

    Name: WWW'12 - Proceedings of the 21st Annual Conference on World Wide Web Companion

    Conference

    Conference: 21st Annual Conference on World Wide Web, WWW'12
    Country: France
    City: Lyon
    Period: 4/16/12 - 4/20/12

    Keywords

    • Group recommendation
    • Multi-armed bandits

    ASJC Scopus subject areas

    • Computer Networks and Communications

    Cite this

    Bento, J., Ioannidis, S., Muthukrishnan, S., & Yan, J. (2012). Group recommendations via multi-armed bandits. In WWW'12 - Proceedings of the 21st Annual Conference on World Wide Web Companion (pp. 463-464). (WWW'12 - Proceedings of the 21st Annual Conference on World Wide Web Companion). https://doi.org/10.1145/2187980.2188078

    @inproceedings{591dbfd06dba43be9a486ddbaf1e351e,
    title = "Group recommendations via multi-armed bandits",
    abstract = "We study recommendations for persistent groups that repeatedly engage in a joint activity. We approach this as a multi-arm bandit problem. We design a recommendation policy and show it has logarithmic regret. Our analysis also shows that regret depends linearly on d, the size of the underlying persistent group. We evaluate our policy on movie recommendations over the MovieLens and MoviePilot datasets. Copyright is held by the author/owner(s).",
    keywords = "Group recommendation, Multi-armed bandits",
    author = "Jos{\'e} Bento and Stratis Ioannidis and Shanmugavelayutham Muthukrishnan and Jinyun Yan",
    year = "2012",
    month = "5",
    day = "21",
    doi = "10.1145/2187980.2188078",
    language = "English (US)",
    isbn = "9781450312301",
    series = "WWW'12 - Proceedings of the 21st Annual Conference on World Wide Web Companion",
    pages = "463--464",
    booktitle = "WWW'12 - Proceedings of the 21st Annual Conference on World Wide Web Companion",

    }

    Scopus record: SCOPUS:84861031435