Intelligent spaces based on many networked sensors have been studied in recent years. Generally, it is not realistic to distribute and calibrate many fixed sensors in such spaces. In this study, mobile sensors carried by humans or mobile robots, such as wearable cameras, are treated as the networked sensors of an intelligent space. Such mobile sensors are effective for the adaptive construction of intelligent spaces. The proposed mobile sensor builds 3D maps of the surroundings while the human moves through the environment; positions are then estimated and tracked using the built map. Similarly, a mobile robot builds environmental maps and estimates its own position using laser range finders (LRFs), which means that robots can also be regarded as networked sensors of the intelligent space. This paper calls the proposed system with networked mobile sensors a 'Mobile Intelligent Space'. This paper presents the details of map building and position estimation for humans and robots, together with experimental map-building results.
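The map-based position estimation described above can be illustrated with a minimal sketch. The following is not the paper's actual method; it is a hypothetical one-dimensional Monte Carlo localization example, in which particles representing candidate positions are weighted against a range reading on a known map (here, a corridor with a single wall) and resampled until the estimate converges. All names, dimensions, and noise parameters are illustrative assumptions.

```python
import math
import random

# Hypothetical example: 1D corridor of length 10 with a wall at x = 10.
# The range sensor measures the distance to the wall; particles are
# candidate positions on the (already built) map.
WALL = 10.0
SENSOR_SIGMA = 0.2  # assumed standard deviation of the range noise

def sense(x):
    """Simulated range reading from position x (distance to the wall)."""
    return WALL - x

def likelihood(particle, reading):
    """Gaussian likelihood of the reading given the particle's position."""
    d = reading - sense(particle)
    return math.exp(-d * d / (2 * SENSOR_SIGMA ** 2))

def resample(particles, weights):
    """Draw a new particle set with probability proportional to the weights."""
    total = sum(weights)
    probs = [w / total for w in weights]
    return random.choices(particles, weights=probs, k=len(particles))

def localize(true_x, n_particles=500, steps=20, seed=0):
    """Track a stationary position from noisy range readings."""
    random.seed(seed)
    particles = [random.uniform(0.0, WALL) for _ in range(n_particles)]
    for _ in range(steps):
        reading = sense(true_x) + random.gauss(0.0, SENSOR_SIGMA)
        weights = [likelihood(p, reading) for p in particles]
        particles = resample(particles, weights)
        # Small diffusion so the particle set does not collapse.
        particles = [p + random.gauss(0.0, 0.05) for p in particles]
    return sum(particles) / len(particles)  # posterior mean estimate

estimate = localize(true_x=3.0)
```

In a real system the map would be a 2D or 3D occupancy grid built from camera or LRF data, and the motion model would use odometry or wearable-sensor motion cues, but the weight-resample cycle is the same.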