There is, and has been, a lot going on, including published papers, grants received, and new development. We're in the process of updating the details and documentation for all of this, and some of it is already online. Please be patient: we tend to prioritize development over publishing news updates.
Please check out some of the more important things:
- The beta of our new web-based geo-visualization page here
- The beta of our pending Android-based Datalogger app here.
This is being updated almost daily as we respond to feedback. Once installed, it will auto-update on your phone. Some screens do not yet have documentation describing their purpose; the app is offered "as is" for the moment, and a full page in the Wiki will be online shortly.
(Suggestion: send that link to the email address you use on your Android phone, then open it there and tap the link. This saves you from typing.)
- There is a Google Group for Sensorcast-users and one for Sensorcast-developers; if you are interested, please join. Eventually, announcements of updates to the blog and wiki will automatically be sent to one or both of those groups.
Sensorcast has been invited to be a poster presenter at the DARPA/UCI Workshop on Mobile and Intelligent Sensor Networks on February 8th, 2016. This is organized by the UC Irvine Office of Research. From the announcement:
“Sensor networks are crucial for collecting data and understanding the physical world. This workshop will explore fundamental challenges in design and deployment of heterogeneous sensor networks that can benefit from mobility and intelligent distributed decision making. Distributed decision making and deployment is fundamental to large heterogeneous sensor networks as global and central algorithms are not feasible. The fundamental goals of the network deployment are improving the capacity/throughput, connectivity, coverage, bandwidth efficiency, mobility, delay, and security of the heterogeneous sensor networks.
The goal of the workshop is to foster discussion, discovery, and dissemination of the state-of-the-art in this area and to identify potential next-generation breakthrough technologies.”
Updated 2016/02/09: Our poster used for the workshop can be found here.
The DECO team at the Wisconsin IceCube Particle Astrophysics Center (WIPAC) has published a second paper entitled Measurement of camera image sensor depletion thickness with cosmic rays. This paper was accepted by the Journal of Instrumentation. The full text of the paper can be found here. The abstract is below:
Camera image sensors can be used to detect ionizing radiation in addition to optical photons. In particular, cosmic-ray muons are detected as long, straight tracks passing through multiple pixels. The distribution of track lengths can be related to the thickness of the active (depleted) region of the camera image sensor through the known angular distribution of muons at sea level. We use a sample of cosmic-ray muon tracks recorded by the Distributed Electronic Cosmic-ray Observatory to measure the thickness of the depletion region of the camera image sensor in a commercial smart phone, the HTC Wildfire S. The track length distribution prefers a cosmic-ray muon angular distribution over an isotropic distribution. Allowing either distribution, we measure the depletion thickness to be between 13.9 μm and 27.7 μm. The same method can be applied to additional models of image sensor. Once measured, the thickness can be used to convert track length to incident polar angle on a per-event basis. Combined with a determination of the incident azimuthal angle directly from the track orientation in the sensor plane, this enables direction reconstruction of individual cosmic-ray events.
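The per-event direction reconstruction the abstract describes can be sketched in a few lines: once the depletion thickness is known, the polar angle follows from the projected track length, and the azimuth comes straight from the track's orientation in the sensor plane. This is a minimal illustration, not the DECO team's code; the 20 μm thickness is an assumed value chosen inside the paper's measured 13.9–27.7 μm range.

```python
import math

# Assumed depletion thickness (μm) for illustration only; the paper
# measures the HTC Wildfire S value to lie between 13.9 and 27.7 μm.
DEPLETION_THICKNESS_UM = 20.0

def reconstruct_direction(dx_um, dy_um, thickness_um=DEPLETION_THICKNESS_UM):
    """Estimate a muon's incident direction from its track in the sensor plane.

    dx_um, dy_um: projected extent of the track in the sensor plane (μm).
    Returns (polar, azimuth) in degrees. The polar angle follows from
    tan(polar) = projected_length / depletion_thickness; the azimuth is
    read directly from the track orientation in the plane.
    """
    projected = math.hypot(dx_um, dy_um)          # track length in the plane
    polar = math.degrees(math.atan2(projected, thickness_um))
    azimuth = math.degrees(math.atan2(dy_um, dx_um)) % 360.0
    return polar, azimuth
```

For example, a track whose projected length equals the depletion thickness corresponds to a muon arriving at a 45° polar angle.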
The DECO team at the Wisconsin IceCube Particle Astrophysics Center (WIPAC) has published a paper entitled Detecting particles with cell phones: the Distributed Electronic Cosmic-ray Observatory and is presenting at the International Cosmic Ray Conference (ICRC) in The Hague, Netherlands on August 6, 2015 (webpage). The full paper can be found here. The abstract is below:
In 2014 the number of active cell phones worldwide for the first time surpassed the number of humans. Cell phone camera quality and onboard processing power (both CPU and GPU) continue to improve rapidly. In addition to their primary purpose of detecting photons, camera image sensors on cell phones and other ubiquitous devices such as tablets, laptops and digital cameras can detect ionizing radiation produced by cosmic rays and radioactive decays. While cosmic rays have long been understood and characterized as a nuisance in astronomical cameras, they can also be identified as a signal in idle camera image sensors. We present the Distributed Electronic Cosmic-ray Observatory (DECO), a platform for outreach and education as well as for citizen science. Consisting of an app and associated database and web site, DECO harnesses the power of distributed camera image sensors for cosmic-ray detection.
We’ve been looking for some time for a simple name that is more concise than Global Sensor Web. To this end we’ve settled on Sensorcast.org. Old GlobalSensorWeb.org links should continue to work for some time and redirect to Sensorcast.org. The links for the API, data collection, and query server have already been converted.
The John S. and James L. Knight Foundation has awarded the Global Sensor Web (GSW) a modest grant to further our efforts. Ariel Levi Simons submitted our proposal via the Wildwood School in LA. Levi has been instrumental in helping us find our first sub-project (DECO) and in finding some earlier funding for that research. Details can be found here.
As we have discussed in our talks and other blog posts, noise is a large factor in creating events, and we don’t want noise, whether thermal (the most common) or magnetic. One of the great things about being able to actually see event images on a computer is that an event created by noise is easy to identify based on the pixels in the image. Events created by noise rarely contain a small group of pixels that is noticeably brighter than the rest of the image, whereas events created by cosmic rays have a cluster of pixels significantly brighter than its surroundings. We can make these clusters visible by altering the color range of the image. Because the charge-coupled devices (CCDs) in the cameras record only red, green (two green samples per pixel group), and blue, all we have to do is distort the color range so that those colors are more pronounced. If the color range of an image is reduced to all green, it is much easier to spot the bright spot(s); without that reduction, the image is nearly black and the brightened pixels are almost impossible to see. After that, it is just a matter of filtering out the events created by noise.
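The green-channel trick described above can be sketched as follows. This is a minimal illustration using NumPy, not the project's actual tooling; the synthetic array stands in for a decoded event image.

```python
import numpy as np

def emphasize_green(rgb):
    """Isolate the green channel and stretch its contrast so faint
    bright spots stand out against the mostly black background.

    rgb: uint8 array of shape (H, W, 3).
    Returns a uint8 grayscale image rescaled to the full 0-255 range.
    """
    green = rgb[:, :, 1].astype(np.float64)
    lo, hi = green.min(), green.max()
    if hi == lo:                       # flat image: nothing to stretch
        return np.zeros_like(rgb[:, :, 1])
    stretched = (green - lo) / (hi - lo) * 255.0
    return stretched.astype(np.uint8)

# Synthetic example: a nearly black frame with one slightly brighter
# pixel, standing in for a candidate cosmic-ray event image.
frame = np.full((8, 8, 3), 3, dtype=np.uint8)
frame[4, 4, 1] = 40                    # faint green pixel, hard to see raw
out = emphasize_green(frame)           # the faint pixel is now full-bright
```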
Over the past few months, we have run many queries and recorded many numbers regarding pixel intensity. A simple way to eliminate events created by noise is to set a minimum pixel intensity per image. While this method may not eliminate all noise events, and could well eliminate lower-energy cosmic-ray particles, it significantly reduces the volume of events, which makes analysis much easier. There is also higher confidence that the events which pass are cosmic rays, because the filter only passes high-energy events. Determining the right minimum pixel intensity is the current focus of the project on a broader scale.
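A minimum-intensity cut like the one described is straightforward to sketch. The cutoff value here is purely hypothetical, since choosing the actual minimum is, as noted, the project's current focus.

```python
import numpy as np

# Hypothetical cutoff for illustration; the real minimum pixel
# intensity is still being determined by the project.
MIN_PEAK_INTENSITY = 120

def passes_intensity_cut(image, threshold=MIN_PEAK_INTENSITY):
    """Keep an event only if its brightest pixel reaches the threshold.

    image: 2-D array of pixel intensities (e.g. the green channel).
    Noise events rarely contain very bright pixels, so a simple
    peak-intensity cut removes most of them while keeping the
    high-energy cosmic-ray candidates.
    """
    return int(image.max()) >= threshold

events = [
    np.array([[5, 7], [6, 200]]),   # bright cluster: likely cosmic ray
    np.array([[5, 7], [6, 30]]),    # dim everywhere: likely noise
]
kept = [e for e in events if passes_intensity_cut(e)]
```

Only the first event survives the cut; lowering the threshold would admit more noise along with lower-energy particles, which is exactly the trade-off the paragraph above describes.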
In terms of the tests run in Wyoming to determine if there was a relationship between altitude and flux, please visit this section of our wiki page.
The “Big” idea behind developing the Global Sensor Web (GSW) is to create a platform for citizen science whereby various research teams can quickly aggregate and analyze data, even from real time sources such as weather stations and air quality monitors. Part of this development work now involves engaging students in helping to lay out what type of data we can capture and, more importantly, what can be done with that data once it’s captured.
This work, currently done in downtown Los Angeles at LA Makerspace, starts with a brainstorming session where some of our students have put together a table of the sensor data which can be collected from phones as well as what type of research can be done with that data. So far we have focused heavily on using the cameras in phones as a way of monitoring cosmic-rays, but with the ability to capture and analyze large amounts of geotagged data for public research we are hoping to build a platform for future citizen science.
Are you confused about what DECO really is? Why cosmic rays are important? Who we really are? Well then, come on over to the LA Makerspace in downtown Los Angeles on Sunday, June 9th for an hour-long briefing somewhere between 2pm and 5pm. We’ll be talking about the current state of the project, future goals, and funding! There will also be several other presentations on various science and education topics.
Tickets are $10 a person. However, if you are already a member of LA Makerspace or donated to the Kickstarter back in February, then you can come for free! We’ll also be serving snacks and refreshments. There’s free food. You really don’t have an excuse to not come.
RSVP and event information is here: http://lamcitizenscience.eventbrite.com/
P.S. Street parking can be scarce, so bring $5-$10 for lot parking.
Here is our set-up for running the aforementioned Cobalt-60 tests with the DECO App:
Testing the DECO app with Cobalt-60 (emitting gamma rays) and an HTC Wildfire.
We will run with this set-up (source is 10cm away from the phone) until we get 1,000 events.