Project Atanua analyzes photographs taken by cameras and smartphones. The ambient daylight color, the sky color, and/or other atmospheric optical phenomena (whichever is available) are analyzed to draw conclusions about relevant parameters, including the aerosol content of the atmosphere within the frame of the picture. Name: from the Polynesian goddess of dawn. Daybreak makes the sky burst into colours.

This project is solving the My Sky Color challenge.


This project aims to record the sky color. The project consists of the following segments:

  1. Sensor segment:

Mobile telephones often come with cameras; however, these need to be calibrated. A usual calibration method is to use a cyanometer or a colorchecker target. An uncalibrated image would not provide any useful colorimetric information.

In astronomy, CCD calibration using dark, flat, and bias frames is not relevant here, because we are looking for color, i.e. signal strength as a function of wavelength, not for the existence of faint point sources or their identification against noise. Nor is asking users to estimate parameters, as the Forest Fuel App project [1] does, a viable option: while user-estimated canopy cover can be cross-checked by a computer using robust contour-identification techniques, to the best of the participants' knowledge no computer-vision algorithm can correctly estimate the color of an unknown target (here the aerosol composition is what is being searched for, so it is unknown) without a camera calibration. Previous researchers have proposed prebuilt / precalibrated spectropolarimeters [2]. We go with a standard color target.
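The simplest form of calibration against a color target can be sketched as follows. This is a minimal illustration, not the project's actual pipeline: it assumes a single known gray patch of the target has been located in the image, and applies a diagonal (white-balance-style) correction derived from it. The patch values are hypothetical.

```python
# Minimal sketch of a one-point color calibration against a known
# reference patch (e.g. one gray square of a colorchecker target).
# The RGB values below are hypothetical; a real workflow would average
# the pixels inside the detected patch region of the photograph.

def channel_gains(measured_rgb, reference_rgb):
    """Per-channel gains that map the measured patch onto its known
    reference value (simple diagonal / white-balance model)."""
    return tuple(ref / meas for ref, meas in zip(reference_rgb, measured_rgb))

def calibrate_pixel(pixel_rgb, gains):
    """Apply the gains to one pixel, clamping to the 8-bit range."""
    return tuple(min(255, round(c * g)) for c, g in zip(pixel_rgb, gains))

# Hypothetical example: the gray patch should read (128, 128, 128),
# but the uncalibrated camera reports a warm color cast.
gains = channel_gains(measured_rgb=(150, 128, 100), reference_rgb=(128, 128, 128))
print(calibrate_pixel((150, 128, 100), gains))  # the patch maps back to gray
```

A full calibration would use all patches of the target and fit a 3x3 color matrix, but the principle is the same: known reference colors anchor the unknown camera response.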

Together with the picture, the location and the smartphone orientation are recorded. The workflow is then transferred to the next segment.
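The recorded orientation is what ties each photograph to a direction on the sky. As a sketch (our assumption, not the project's actual code), suppose the phone reports a compass azimuth (degrees clockwise from north) and a pitch angle (degrees above the horizon) for the camera axis; the quantity atmospheric models typically need is the zenith angle of the viewing direction:

```python
import math

# Hypothetical conversion from a smartphone orientation reading to the
# sky direction a photo points at. Assumes azimuth in degrees clockwise
# from north and pitch in degrees above the horizon.

def view_direction(azimuth_deg, pitch_deg):
    """Return (azimuth, zenith angle) of the viewing direction in degrees.

    Zenith angle 0 means looking straight up; 90 means the horizon."""
    return azimuth_deg % 360.0, 90.0 - pitch_deg

def unit_vector(azimuth_deg, zenith_deg):
    """East-North-Up unit vector for the same viewing direction."""
    az, zen = math.radians(azimuth_deg), math.radians(zenith_deg)
    return (math.sin(az) * math.sin(zen),   # east component
            math.cos(az) * math.sin(zen),   # north component
            math.cos(zen))                  # up component

print(view_direction(180.0, 90.0))  # pointing straight up -> zenith angle 0.0
```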

  2. Capturing and transfer segment:

Different mobile OSs support different app frameworks. As a compromise, we select HTML5-JS technology. While in Firefox OS JS can directly access the camera parameters, the possibility remains open to implement, e.g., a JavaScript interface to the native Java API of the Android platform. HTML5-JS thus gives us maximum flexibility as well as the possibility to expand the app in the future. The same app transmits the picture to a server; in our case, a FlatPress CMS is used. Android, as well as Chrome for Android, can simply visit a web page to call a web app. Even a desktop user can visit the web page and upload a picture, if the other parameters are known.

  3. Analysis segment:

Analyzing the images on board the camera itself may seem viable. However, it is always better to transfer the data in the rawest possible state to the analysis system. As the server is populated with images, we can also implement security measures there. Should we nonetheless find during development that on-board processing is indeed a good option, we can always add a JS script to our web app, and the clients will be able to perform the additional task without having to worry about software synchronization.

For analysis, we can always use the SORCE data, updated daily; more precisely, the SIM dataset, which covers the visible band [3]. The upper atmospheric attenuation can be computed from standard models. We can then compute the following parameters:
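The attenuation step can be illustrated with the simplest standard model, Beer-Lambert extinction along the slant path. This is a sketch under stated assumptions, not a full radiative-transfer implementation: the numbers are hypothetical, and the plane-parallel airmass approximation used here is only adequate away from the horizon.

```python
import math

# Minimal sketch of Beer-Lambert attenuation for one wavelength band:
# top-of-atmosphere irradiance (e.g. from the SORCE/SIM dataset) is
# attenuated by the total optical depth along the slant path.

def airmass(zenith_deg):
    """Plane-parallel relative airmass, 1/cos(zenith)."""
    return 1.0 / math.cos(math.radians(zenith_deg))

def surface_irradiance(toa_irradiance, optical_depth, zenith_deg):
    """Ground-level irradiance after extinction with total optical depth tau."""
    return toa_irradiance * math.exp(-optical_depth * airmass(zenith_deg))

# Hypothetical numbers: 1.0 W/m^2/nm at the top of the atmosphere,
# optical depth 0.3 in the blue, sun 60 degrees from the zenith (airmass 2).
print(round(surface_irradiance(1.0, 0.3, 60.0), 4))
```

Comparing the calibrated sky color against such a model is what lets the unknown aerosol optical depth be inferred from the measured attenuation.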

  1. Turbidity, described in [4], accessed via the blue channel only
  2. Blueness of the sky (comparable to cyanometer readings)
  3. Cloud illumination; see [5] and [6] for the importance of this
  4. Spectral signature of attenuating species, and its intensity
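As an illustration of the second parameter, a blueness index can be computed directly from calibrated sky pixels. This is a minimal stand-in of our own (an assumption, not the project's defined metric) for a cyanometer-style reading: the fraction of each pixel's intensity carried by the blue channel, averaged over the sky region.

```python
# Minimal sketch of a blueness index over a set of calibrated sky pixels.

def blueness(pixels):
    """Mean B/(R+G+B) over an iterable of (R, G, B) sky pixels."""
    ratios = [b / (r + g + b) for r, g, b in pixels]
    return sum(ratios) / len(ratios)

# Hypothetical sky patches: a deep-blue sky scores higher than a hazy,
# desaturated one, mirroring how a cyanometer would rank them.
deep_blue = [(60, 90, 200), (55, 85, 190)]
hazy      = [(170, 180, 200), (160, 175, 190)]
print(blueness(deep_blue) > blueness(hazy))  # True
```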

[1] :
[2] :
[3] :
[4] : R. Dogniaux (Ed.), Prediction of Solar Radiation in Areas with a Specific Microclimate, Kluwer Academic Publishers, 1994
[5] : Martins,
[6] : Wilcox,

Project Information

License: GNU General Public License version 3.0 (GPL-3.0)

Source Code/Project URL:


Main Site -
Android App -


  • Shane Kirkbride
  • Sayandeep Khan