FIELD: transport; navigation.
SUBSTANCE: location information of objects in images photographed by a camera, and location information of display objects such as buildings contained in the map data, are calculated from the location information of the camera. The calculated location information is used to match the photographed objects with the display objects. The image photographed by the camera is displayed on a screen, and text information of the display objects matched to the photographed objects is read out of the map data and displayed at the positions of those objects on the screen. When the user moves, the travel route is guided using the matching information between the photographed objects and the display objects, so that the user can follow the route while visually identifying real objects such as buildings.
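The abstract does not specify concrete formulas or data formats, so the following is only a minimal sketch of the matching and overlay idea under stated assumptions: the camera pose (latitude, longitude, compass heading) and an assumed horizontal field of view are used to decide which map display objects fall inside the camera view, and a simple linear mapping of bearing offset to pixels places each text label over the live image. The class names, field-of-view value, and flat-earth bearing approximation are illustrative assumptions, not the patented method.

```python
import math
from dataclasses import dataclass

# Hypothetical data structures for illustration only; the patent
# abstract does not define concrete formats or algorithms.

@dataclass
class DisplayObject:          # map entry, e.g. a building
    name: str
    lat: float
    lon: float

@dataclass
class CameraPose:             # camera location and orientation (e.g. from GPS and compass)
    lat: float
    lon: float
    heading_deg: float        # direction the camera faces, clockwise from north
    fov_deg: float = 60.0     # assumed horizontal field of view

def bearing_deg(from_lat, from_lon, to_lat, to_lon):
    """Approximate bearing from the camera to an object (flat-earth, small-area approximation)."""
    d_lat = to_lat - from_lat
    d_lon = (to_lon - from_lon) * math.cos(math.radians(from_lat))
    return math.degrees(math.atan2(d_lon, d_lat)) % 360.0

def match_and_project(pose: CameraPose, display_objects, image_width_px=1280):
    """Match map display objects to horizontal positions in the camera image.

    An object whose bearing falls inside the camera's field of view is
    assumed visible; its pixel position is a linear mapping of the
    bearing offset onto the image width.
    """
    matches = []
    for obj in display_objects:
        b = bearing_deg(pose.lat, pose.lon, obj.lat, obj.lon)
        offset = (b - pose.heading_deg + 180.0) % 360.0 - 180.0   # signed angle to optical axis
        if abs(offset) <= pose.fov_deg / 2.0:
            x_px = int((offset / pose.fov_deg + 0.5) * image_width_px)
            matches.append((obj.name, x_px))   # text label and screen x-coordinate
    return matches

# Example: decide where to draw building names over the live camera image.
camera = CameraPose(lat=37.5665, lon=126.9780, heading_deg=45.0)
buildings = [DisplayObject("City Hall", 37.5668, 126.9784),
             DisplayObject("Museum", 37.5660, 126.9770)]
for name, x in match_and_project(camera, buildings):
    print(f"draw label '{name}' at x={x}px over the camera image")
```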
EFFECT: navigation tasks are performed with enhanced precision.
32 cl, 8 dwg
Dates
Filed: 2006-06-13
Published: 2008-05-10