From IEEE Spectrum online:
Sensors are starting to prove themselves in the biggest, most complex bridges, but the technology isn’t ready for the hundreds of thousands of smaller ones
The 2.9-kilometer Rion-Antirion Bridge in Greece, with its 300 sensors, is a testament to how smart a piece of infrastructure can be. It routinely tells operators when an earthquake (frequent in those parts) or high winds warrant shutting down traffic.
“The bridge tolls are meant to collect thousands of euros per day,” says Alexandre Chaperon, an engineer at the company that designed the system, Advitam, in Vienna, Va. “Without the monitoring system, the bridge would be closed after every earthquake, more than three days in some cases, instead of 5 minutes.”
Dozens of the largest and most complex bridges in the world are already studded with strain and displacement gauges, three-dimensional accelerometers, tiltmeters, temperature sensors, and other instruments. They are wired to central data-acquisition units—though some newer bridges have wireless systems—which collect and analyze the information and relay it to engineers, in hopes of catching signs of distress before human inspectors could. With the United States injecting US $27.5 billion into revamping the country’s roadways and bridges as part of an $800 billion economic stimulus effort, it might seem like a perfect opportunity to add smarts to more bridges.
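To give a flavor of the kind of analysis such a monitoring system might run, here is a small sketch (an illustration only, not Advitam's actual algorithm; all names are ours): flag any strain-gauge reading that deviates sharply from a rolling baseline, the sort of check that lets software spot distress before a human inspector would.

```java
// Illustrative anomaly check for a single strain gauge: keep a rolling
// window of recent readings and raise an alert when a new reading is
// many standard deviations away from the window's mean.
import java.util.ArrayDeque;
import java.util.Deque;

public class StrainMonitor {
    private final Deque<Double> window = new ArrayDeque<>();
    private final int windowSize;
    private final double zThreshold;

    public StrainMonitor(int windowSize, double zThreshold) {
        this.windowSize = windowSize;
        this.zThreshold = zThreshold;
    }

    /** Returns true if the reading is anomalous relative to the rolling baseline. */
    public boolean update(double reading) {
        boolean alarm = false;
        if (window.size() == windowSize) {
            double mean = window.stream().mapToDouble(Double::doubleValue).average().orElse(0);
            double var = window.stream().mapToDouble(v -> (v - mean) * (v - mean)).average().orElse(0);
            double sd = Math.sqrt(var);
            alarm = sd > 0 && Math.abs(reading - mean) / sd > zThreshold;
            window.removeFirst(); // slide the window forward
        }
        window.addLast(reading);
        return alarm;
    }

    public static void main(String[] args) {
        StrainMonitor m = new StrainMonitor(5, 3.0);
        // Steady readings, then a spike such as a seismic event might produce.
        double[] readings = {10.0, 10.1, 9.9, 10.0, 10.1, 10.0, 25.0};
        for (double r : readings) {
            if (m.update(r)) System.out.println("ALERT at reading " + r);
        }
    }
}
```

A real bridge system would of course fuse many channels (accelerometers, tiltmeters, temperature) rather than threshold one gauge, but the window-and-deviation idea is the common core.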
Read the complete article here.
World’s leading low-power wireless networking standard adds new specification featuring seamless integration with global IT networks to portfolio
SAN RAMON, Calif., April 27 /PRNewswire/ — The ZigBee Alliance, a global ecosystem of companies creating standardized wireless solutions for use in energy management, commercial and consumer applications, today announced it will incorporate global IT standards from the Internet Engineering Task Force (IETF) into its specification portfolio of low-power wireless networking standards. This move will expand the growing portfolio of successful ZigBee specifications and should further advance the rapid growth of Smart Grid applications that have widely adopted the proven ZigBee Smart Energy public application profile.
For more info click here
From Roger Meike’s blog: The upcoming release of Sun SPOT software (Red v5.0) will feature a new robotics simulator built into Solarium. It allows you to use virtual Sun SPOTs to program virtual robots using the Sun SPOT Java Development Kit. The great thing about it is that you can download it and use it for free right now, even if you don’t have Sun SPOTs.
The simulation roughly mimics the environment used in IARoc, a competition held in San Diego each year and sponsored by Wintriss School. In this competition students use Sun SPOTs to control an iRobot Create and navigate a maze. The Create is very much like its more popular sibling, the Roomba, except it doesn’t suck… by which I mean it has no vacuum… (Roombas are great, BTW). The combination of a Create and a Sun SPOT is fantastic for learning the basics of robot motion and navigation.

In the simulation you write software for a Sun SPOT, and then, in Solarium, create a new Robot View and add a robot. This creates a Robot/Virtual Sun SPOT combination. Back in the Grid View, you will now see a Virtual SPOT. You deploy and run your software on this Virtual SPOT, and it controls your simulated robot. You have your choice of three environments to run in: an empty room, a maze, or an obstacle course. Each view includes an ‘X’ as a starting point and an ‘O’ as an ending point. Your robot includes sensors that let it detect when it is over one of these marks. We also provide a sample application that can find the center of the room, or, if you poke around in the code a little, you’ll see a simple wall follower. It’s just enough to get you started.
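To give a flavor of what a wall follower does, here is a library-free Java sketch of the right-hand-rule decision step. The actual sample runs on a (virtual) Sun SPOT and issues iRobot Create drive commands through the Sun SPOT SDK; those API calls are omitted here, and the class and method names below are ours, not the SDK’s.

```java
// Library-free sketch of a right-hand wall follower's decision step:
// keep a wall on the robot's right and it will eventually traverse
// any simply connected maze.
public class WallFollower {
    public enum Action { FORWARD, TURN_LEFT, TURN_RIGHT }

    /**
     * Classic right-hand rule.
     * @param wallAhead  obstacle sensed in front (e.g. via bump sensor)
     * @param wallRight  wall sensed on the right side
     */
    public static Action step(boolean wallAhead, boolean wallRight) {
        if (!wallRight) return Action.TURN_RIGHT; // wall ended: turn into the gap
        if (wallAhead)  return Action.TURN_LEFT;  // blocked: turn along the wall
        return Action.FORWARD;                    // wall on right, path clear
    }

    public static void main(String[] args) {
        System.out.println(step(false, true));  // wall on right, clear ahead
        System.out.println(step(true, true));   // cornered against the wall
        System.out.println(step(false, false)); // lost the wall
    }
}
```

On real hardware, each `Action` would be translated into wheel-speed commands to the Create, and the two booleans would come from its bump and wall sensors.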
From Gizmodo, here is an article about the usefulness of WSN:
You know those guys (and gals?) who are just, like, super proud of their farts? Thanks to this cool guy and Twitter, these assholes can indulge their disgusting habit without wrecking our noses.
Known Gentleman Randy Sarafan decided to make this office chair to help “accurately document and share [his] life as it happens,” which is as admirable a cause as there ever has been to open a Twitter account. The setup is surprisingly complex: a natural gas sensor does the sniffing; an Arduino does the thinking; a Squidbee wireless module does the communicating; Twitter does the sharing. It’s a feat, to be sure.
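For the curious, the sensing step boils down to a debounced threshold: fire one event per excursion of the gas reading above a cutoff, so a single, er, incident doesn’t produce a stream of tweets. The real build’s Arduino sketch is C; this is an illustrative Java rendering with made-up names and threshold, not Sarafan’s code.

```java
// Debounced threshold detector: returns true only on the rising edge
// of the sensor reading crossing the threshold, so one excursion
// produces exactly one event (i.e. one tweet).
public class GasEventDetector {
    private final int threshold;
    private boolean above = false; // are we currently in an excursion?

    public GasEventDetector(int threshold) { this.threshold = threshold; }

    /** Returns true exactly once per excursion above the threshold. */
    public boolean sample(int reading) {
        if (reading >= threshold && !above) {
            above = true;
            return true;               // rising edge: time to tweet
        }
        if (reading < threshold) above = false; // excursion over, re-arm
        return false;
    }

    public static void main(String[] args) {
        GasEventDetector d = new GasEventDetector(400);
        int[] samples = {120, 130, 450, 480, 460, 200, 410};
        for (int s : samples) {
            if (d.sample(s)) System.out.println("event at reading " + s);
        }
        // fires on the first sample of each excursion (450 and 410), not on every elevated sample
    }
}
```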
If you, like 131 others (and counting!) feel the need to follow the goings-on around Sarafan’s anus, you can follow his tweeted tweets here. Alternatively, you can do the project yourself—it’s open source and a build tutorial is on Instructables, thank god. [Instructables via Make]
From The Web of Things:
WOT is not a technology. It is not a standard either. WOT is a vision and a community.
This is maybe the most important aspect of WOT. We tend to position our work radically above the current Internet of Things and the like. For us, the Internet of Things is about connecting devices together over the Internet. Great! Wow! So what? Why should we care?
Networked objects have never been (and should never be) about just connecting things together. It’s about why we need to connect things. Most people ask us, “But why do you guys want to connect your fridge to your toaster? What’s the point? Why would you want to do that?” I seriously don’t know, and I really don’t care! And if somebody asks me that again, I’ll slap him in the face, I promise! It’s as simple as that. I also have no idea why anybody would connect a dildo to the RSS feed of the vibration sensor on a volcano in Vanuatu. But I’m sure there’s a lucky girl (or boy) out there who knows! And if she can’t hack it herself, then she’ll never have a volcano-linked dildo, and that’s no good.

That’s exactly the kind of people we care about (not the sex freaks, but the average Bob or Alice): people who want to connect something to something else! That’s what the Web of Things is all about: people who know just a little about computers and would like to do much more with them, creating new things that nobody thought of before, without a PhD in computer science. Technology today just sucks, because the people who build things are too selfish to care about the users, and because of that, in the end, most things out there are way too complex to do what they were supposed to do. Technology is so not plug & play, and we can change that, because the technology to do it is out there and works well enough for my parents (but maybe not for the guys playing with the LHC).
You can read the entire post here.
The integration of sensors with social networks will lead to real-time data and more useful web apps.
In recent posts we reviewed an MIT experiment called WikiCity, which gathered real-time location data from mobile phones in Rome and graphically mapped trends from it. We then looked at a more commercial product doing similar real-time location data analysis, called Citysense. That product aims to let users find the most popular night spots in San Francisco and the most efficient ways to get to them. The next stage for projects and products such as WikiCity and Citysense will be to let users network socially, using data from sensors as one input.
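The core aggregation idea behind projects like WikiCity and Citysense can be sketched in a few lines (a toy illustration, not either product’s actual pipeline; all names are ours): bucket anonymous location fixes into grid cells and rank cells by count to find the current hot spots.

```java
// Toy location aggregator: count anonymous lat/lon fixes per grid cell
// and report the busiest cell, the essence of a "where is everyone?" map.
import java.util.HashMap;
import java.util.Map;

public class CrowdGrid {
    private final double cellSize;                 // grid resolution in degrees
    private final Map<String, Integer> counts = new HashMap<>();

    public CrowdGrid(double cellSize) { this.cellSize = cellSize; }

    // Map a lat/lon fix to a grid-cell key.
    String cell(double lat, double lon) {
        return (long) Math.floor(lat / cellSize) + ":" + (long) Math.floor(lon / cellSize);
    }

    // Record one anonymous location fix.
    public void report(double lat, double lon) {
        counts.merge(cell(lat, lon), 1, Integer::sum);
    }

    // Return the key of the most crowded cell.
    public String busiest() {
        return counts.entrySet().stream()
                .max(Map.Entry.comparingByValue())
                .map(Map.Entry::getKey)
                .orElse(null);
    }

    public static void main(String[] args) {
        CrowdGrid g = new CrowdGrid(0.01);         // roughly 1 km cells
        g.report(37.7749, -122.4194);              // three fixes in one SF cell
        g.report(37.7750, -122.4190);
        g.report(37.7751, -122.4189);
        g.report(37.8044, -122.2711);              // one fix across the bay
        System.out.println("busiest cell: " + g.busiest());
    }
}
```

A real system would add time decay and privacy safeguards, but ranking cells by recent fix count is the heart of the trend map.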
More info here
A new, simpler programming language for wireless sensor networks, written with the novice programmer in mind, can be used by geologists monitoring volcanoes and by biologists who rely on sensor networks to understand birds’ nesting behaviors. Finding an embedded systems expert to program a sensor network is difficult and costly, and can lead to errors because the person using the network is not the person programming it. The cost and disconnect associated with this situation mean these networks aren’t being used to their full potential.
Lan Bai, U-M doctoral student in electrical engineering and computer science, will present a paper on the new programming language on April 13 at the Conference on Information Processing in Sensor Networks in St. Louis.
More info here.
At CTIA, one of the biggest telecom industry conferences, former vice president turned cleantech investor Al Gore told wireless executives in the audience last Friday that wireless technology will be one of the key tools used to fight climate change: “This is one of those rare times we all agree that the government needs to build out a green infrastructure that will free us from foreign oil and draw on clean energy.” It’s one of the themes we touched on at our recent Green:Net conference.
Wireless sensor networks and communication networks placed on the grid will help utilities monitor and control the flow of energy and address power outages more effectively. At the edge of the grid, consumers will use wireless networks to better manage their energy consumption. Investors are starting to make more investments in these wireless technology pieces: just today Ember told us it has raised $8 million to help it deploy more of its wireless sensor network technology. Traditional telcos like AT&T are also repositioning themselves to sell into the smart grid; AT&T says it is working with smart meter maker SmartSynch to provide its wireless network for residential installations.
More info here.
Building on last year’s successful event, the Wireless Sensing Interest Group (WiSIG) of the Sensors & Instrumentation Knowledge Transfer Network (SIKTN) is organising a Demonstrator Showcase to provide an opportunity for industry and academia to display technology, platforms, and realistic applications of wireless sensing. The purpose of the event is to raise awareness of the current state of the art and encourage future collaborations among the exhibitors and attendees.
We are inviting innovative demonstrations from industry and academia which will be classified into two categories:
a) R&D systems and b) Commercial products. We welcome both wireless sensing systems developed as ‘instruments to enable scientific investigation’ and also ‘as solutions to known problems’. Demonstrators need to be end-to-end systems with achievements beyond the state of the art at one or more of the following levels: physical level, sensing level, data fusion, middleware, communications, information extraction, and user interfaces.
Demo submission deadline: 15 May 2009, 1 June 2009
Camera ready version: 15 June 2009, 15 June 2009, 2 July 2009
Registration is free of charge. More info here.