
Shimosaka Research Group

pursuing MIUBIQ (machine intelligence in UbiComp Research)


Omni-CityMood: Vision-based Urban Atmosphere Perception from Every Angle at SIGSPATIAL2025

2025/11/04 | News, Presentations

The 33rd ACM SIGSPATIAL International Conference on Advances in Geographic Information Systems (ACM SIGSPATIAL 2025), one of the top conferences in the field of spatial information science, is being held in Minneapolis, USA, from November 3 to 6, 2025.

The following work will be presented at this conference.

Omni-CityMood: Vision-based Urban Atmosphere Perception from Every Angle

Abstract
Understanding how cities are perceived from on-site visitors’ perspectives can provide valuable insights for urban planning and development applications. However, existing studies estimated people’s perceptions by having them view photographed landscape images; the scores derived by these methods were thus merely quantified impressions of specific viewpoints that do not necessarily represent perceptions people would have were they at the site. To address this issue, we developed a framework, named Omni-CityMood, for quantifying people’s on-site perceptions of urban atmospheres. Based on the idea that the viewpoint influences the perception of an urban landscape, the proposed framework identifies critical viewpoints of a location by using both visual-based features of landscape images and geographical characteristics of the site. In particular, Omni-CityMood enables the mood of a location to be evaluated from viewpoints over a range of 360 degrees by leveraging the techniques of neural recommendation systems. We evaluated Omni-CityMood on a dataset we built that includes perceived atmosphere experiences in various cities. Experiments and extensive analyses demonstrate the promising capability of modeling landscape viewpoints to quantify urban on-site atmospheres.
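
For readers wondering what evaluating a location's mood "from viewpoints over a range of 360 degrees by leveraging the techniques of neural recommendation systems" might look like in code, here is a minimal illustrative sketch; it is not the authors' implementation. It assumes precomputed visual features for each viewing direction and a coarse geographic cell id, and every name in it (ViewpointMoodScorer, img_dim, n_geo_cells, geo_id), as well as the two-tower, dot-product design, is an assumption made only for this example.

# Minimal illustrative sketch (assumptions only, not the Omni-CityMood code):
# scoring a location's atmosphere from several viewing directions with a
# recommender-style two-tower model over visual and geographic features.
import torch
import torch.nn as nn

class ViewpointMoodScorer(nn.Module):
    def __init__(self, img_dim=512, n_geo_cells=10000, emb_dim=64):
        super().__init__()
        # "Image tower": maps precomputed visual features of one viewpoint
        # image (e.g. from a CNN/ViT backbone) into a shared embedding space.
        self.img_tower = nn.Sequential(
            nn.Linear(img_dim, 128), nn.ReLU(), nn.Linear(128, emb_dim)
        )
        # "Geo tower": a learned embedding per coarse geographic cell id.
        self.geo_emb = nn.Embedding(n_geo_cells, emb_dim)

    def forward(self, img_feats, geo_id):
        # img_feats: (n_viewpoints, img_dim), one row per viewing direction.
        # geo_id:    integer id of the site's geographic cell.
        v = self.img_tower(img_feats)            # (n_viewpoints, emb_dim)
        g = self.geo_emb(geo_id)                 # (emb_dim,)
        scores = (v * g).sum(dim=-1)             # per-viewpoint mood score
        weights = torch.softmax(scores, dim=0)   # relative viewpoint importance
        site_mood = (weights * scores).sum()     # aggregated 360-degree score
        return site_mood, weights

# Toy usage: 8 viewpoints sampled at 45-degree increments around one site.
scorer = ViewpointMoodScorer()
img_feats = torch.randn(8, 512)
site_mood, weights = scorer(img_feats, torch.tensor(3))
print(float(site_mood), weights.tolist())

The dot-product interaction and softmax weighting stand in for whatever viewpoint-selection mechanism the paper actually uses; the point is only that per-viewpoint visual signals and geographic context can be combined and then aggregated over the full 360 degrees.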

——
Presentation Information
Date & Time: Tuesday, November 4, 2025, 16:00 – 17:50
Session: Research 3: Urban Computing and Land Use
Title: Omni-CityMood: Vision-based Urban Atmosphere Perception from Every Angle
Authors: Yuki Kubota (Institute of Science Tokyo), Kota Tsubouchi (LY Corporation), Sayaka Anno (Institute of Science Tokyo), Kaito Ide (Institute of Science Tokyo), Masamichi Shimosaka (Institute of Science Tokyo)


Copyright 2015 · Shimosaka Research Group at TITECH