Sunday, November 14, 2010

PCM Spectrum Analyser and Octal FSK Decoder / Encoder

This was a prototyping project to see if I could capture PCM (pulse code modulated) data from my sound card - where the sound was encoded using an octal FSK (frequency shift keyed) method - and then decode it. The application displays the captured waveform (amplitude/time domain), a frequency/time domain plot, and the frequency components from a fast Fourier transform (FFT) of a sampled portion of the original waveform. It also has a waterfall-type frequency display which is generated from iterative FFTs over the complete waveform. Not only can the application capture and decode the PCM data - it can also use an inverse Fourier transform (Fourier synthesis) to encode data to PCM. It even includes a method to synthesize noise on the generated PCM waveform. The intention is that this prototype will form the basis of an octal FSK encoder / decoder service that allows multiple client applications on a network to send and receive data over a radio channel. Developed in C# making use of DirectX DirectSound and the Exocortex.DSP (http://www.exocortex.org/dsp/) digital signal processing engine.
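To give a flavour of the decode step, here is a minimal sketch of the approach: for each symbol-length window of PCM samples, measure the energy at each of the eight candidate tones and take the strongest as the received symbol (three bits per symbol). The tone frequencies, sample rate and symbol length are illustrative assumptions only, and a naive single-bin DFT per tone stands in for the Exocortex.DSP FFT and the symbol synchronisation that the real application needs.

```csharp
using System;

// Minimal sketch of an octal FSK decode step. For each symbol-length window of
// PCM samples, the energy at each of the 8 candidate tones is measured and the
// strongest tone is taken as the received symbol (3 bits per symbol).
// Tone frequencies, sample rate and symbol length are illustrative assumptions.
static class OctalFskDecoder
{
    const int SampleRate = 44100;        // PCM sample rate (assumed)
    const int SamplesPerSymbol = 1024;   // samples per symbol window (assumed)
    static readonly double[] Tones =     // 8 tone frequencies in Hz (assumed)
        { 800, 900, 1000, 1100, 1200, 1300, 1400, 1500 };

    // Returns one symbol value (0-7) per window of PCM samples.
    public static int[] Decode(float[] pcm)
    {
        int symbolCount = pcm.Length / SamplesPerSymbol;
        var symbols = new int[symbolCount];

        for (int s = 0; s < symbolCount; s++)
        {
            int offset = s * SamplesPerSymbol;
            double bestMagnitude = -1.0;
            int bestTone = 0;

            // Correlate the window against each tone - effectively one DFT bin per tone.
            for (int t = 0; t < Tones.Length; t++)
            {
                double re = 0.0, im = 0.0;
                double w = 2.0 * Math.PI * Tones[t] / SampleRate;
                for (int n = 0; n < SamplesPerSymbol; n++)
                {
                    re += pcm[offset + n] * Math.Cos(w * n);
                    im -= pcm[offset + n] * Math.Sin(w * n);
                }
                double magnitude = re * re + im * im;
                if (magnitude > bestMagnitude) { bestMagnitude = magnitude; bestTone = t; }
            }
            symbols[s] = bestTone;   // each symbol carries 3 bits of data
        }
        return symbols;
    }
}
```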

Saturday, November 13, 2010

Software Defined Radio and PSK31 Decode using Digipan

I recently purchased a WinRadio Excalibur Software Defined Radio (SDR). This very cool radio can scan and analyse radio signals across the full 0.00 - 50.00 MHz range - and therefore makes a great wide-spectrum analyser! It supports AM, AMS, FM, LSB/USB (SSB), CW, DRM, FSK and UDM modulation.

For a while I listened to the amateur bands (particularly 20 and 15m) and recorded "QSOs" in my own QSO logging software (which I will feature on this blog at some stage soon) - however I quickly became interested in decoding digital modes in the amateur bands (e.g. PSK31).

The screen capture above shows the SDR receiving PSK31 and decoding it using a free software application called Digipan. To set this up I also had to download VAC (Virtual Audio Cable) to route the audio to Digipan for decoding.

Saturday, October 16, 2010

CueSo (Advanced Amateur Radio QSO Logging Software)

After the purchase of my WinRadio Software Defined Radio (SDR), I started listening to a lot of radio amateur traffic on the 20 and 15m bands. I use a very compact active antenna which is only about 40cm long mounted vertically off the side of my apartment balcony here in London. I was interested to see what sort of range and performance I could get using this setup (which is less than desirable) and wanted to be able to plot these on a map and perhaps do some basic spatial analysis on received signal strengths etc.

Obviously my first stop was good old Google Maps to plot out the locations of the stations I was receiving. QRZ.com is a great site that lets you get the QTH (location) of a station given its call sign - and this is what I used to determine where the stations were. Unfortunately Google Maps gives misleading results if you are interested in the location of stations relative to your own, because it displays locations given in geographical coordinates (latitude and longitude) using an inappropriate map projection: both azimuth and distance are distorted under the Mercator projection. A polar projection centered at the receiving station is more appropriate for this type of map display.
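Incidentally, distance and azimuth from the receiver are exactly the quantities a polar projection centered on the receiving station preserves. A rough sketch of computing them on a sphere - haversine distance plus initial bearing - looks like this (not CueSo's actual geodetic engine, just the idea):

```csharp
using System;

// Rough sketch: great-circle distance (haversine) and initial azimuth from the
// receiving station to a contacted station - the two quantities that a polar
// projection centered on the receiver preserves and that Mercator distorts.
static class GreatCircle
{
    const double EarthRadiusKm = 6371.0;   // mean Earth radius (spherical model)

    public static (double DistanceKm, double AzimuthDeg) FromTo(
        double lat1Deg, double lon1Deg, double lat2Deg, double lon2Deg)
    {
        double lat1 = lat1Deg * Math.PI / 180.0, lon1 = lon1Deg * Math.PI / 180.0;
        double lat2 = lat2Deg * Math.PI / 180.0, lon2 = lon2Deg * Math.PI / 180.0;
        double dLat = lat2 - lat1, dLon = lon2 - lon1;

        // Haversine formula for great-circle distance.
        double a = Math.Sin(dLat / 2) * Math.Sin(dLat / 2) +
                   Math.Cos(lat1) * Math.Cos(lat2) *
                   Math.Sin(dLon / 2) * Math.Sin(dLon / 2);
        double distanceKm = 2.0 * EarthRadiusKm * Math.Asin(Math.Sqrt(a));

        // Initial bearing (azimuth) from the receiver towards the station.
        double y = Math.Sin(dLon) * Math.Cos(lat2);
        double x = Math.Cos(lat1) * Math.Sin(lat2) -
                   Math.Sin(lat1) * Math.Cos(lat2) * Math.Cos(dLon);
        double azimuthDeg = (Math.Atan2(y, x) * 180.0 / Math.PI + 360.0) % 360.0;

        return (distanceKm, azimuthDeg);
    }
}
```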

So my second stop was to load up the data in ArcGIS (Geographic Information System) and choose a polar coordinate system centered at my apartment. I was intending to put some sort of nice interface on top of ArcEngine to allow me to add and query the logged stations more easily - however I found ArcMap so slow that I decided it would be better just to write my own display engine and implement the QSO logging software from scratch. This would also give me more creative control over what I wanted to do.

CueSo is the result of this software development effort (developed in C#):

Fig 1: Screen Capture of CueSo in Map Display and Spatial Analysis Mode.

The software allows you to:
  • Leverage the QRZ.com database to quickly search for stations given a call sign (CueSo consumes an XML-based service provided by QRZ.com).
  • Enter data with as much auto-complete and look-up intelligence as possible, to simplify and speed up an otherwise tedious task.
  • Record and log QSOs based on the ADIF (Amateur Data Interchange Format) standard fields.
  • Query the QSO database using SQL (Structured Query Language), which allows complex queries to be performed.
  • Display QSOs in a polar map display. Map display options are configurable.
  • Choose from a number of geodetic engines to perform the map projections (Half-Versed-Sine (haversine), Spherical Law of Cosines, Vincenty, Redfearn's Formula).
  • Calculate Maidenhead locators (grid squares) on the fly and display them along with geographical coordinates (see the sketch after this list).
  • Perform spatial analysis on received signal strengths using an anti-aliasing kernel over the QSOs to generate a raster surface, which can be normalised by distance to station and radio path density. (The raster engine is my own "RasterSurface" engine, which I will feature in this blog at some stage in the future.)
  • Make simple graphs from the QSO database (countries contacted, contacts over time, frequency utilisation etc.).
  • Pull space weather information (important for radio propagation conditions) from the NOAA Space Weather web service.
  • Make audio recordings of the QSOs and play them back at a later date. I did this using the DirectX Direct Sound libraries - but have also included an option to use low-level WAV capture using code from Ianier Munoz (avoiding the need for DirectX).
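As an example of the on-the-fly locator calculation mentioned above, here is a rough sketch of the standard six-character Maidenhead encoding (CueSo's own implementation may differ in detail):

```csharp
using System;

// Rough sketch of a six-character Maidenhead locator (grid square) calculation.
// Longitude is offset to 0-360 and latitude to 0-180, then each pair of
// characters subdivides the previous level: field, square, subsquare.
static class Maidenhead
{
    public static string Locator(double latDeg, double lonDeg)
    {
        double lon = lonDeg + 180.0;   // 0..360
        double lat = latDeg + 90.0;    // 0..180

        char fieldLon  = (char)('A' + (int)(lon / 20));         // 20 degree fields
        char fieldLat  = (char)('A' + (int)(lat / 10));         // 10 degree fields
        char squareLon = (char)('0' + (int)((lon % 20) / 2));   // 2 degree squares
        char squareLat = (char)('0' + (int)(lat % 10));         // 1 degree squares
        char subLon    = (char)('a' + (int)((lon % 2) * 12));   // 5 minute subsquares
        char subLat    = (char)('a' + (int)((lat % 1) * 24));   // 2.5 minute subsquares

        return new string(new[] { fieldLon, fieldLat, squareLon, squareLat, subLon, subLat });
    }
}

// Example: Maidenhead.Locator(51.5, -0.12) returns "IO91wm" (central London).
```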
Screen Captures:
Fig 2: Logging on to QRZ.com Services.

Fig 3: CueSo displaying QSOs in tabular form with an SQL query applied. Left panel is the data entry control.
Fig 3.5: Perform advanced SQL queries using the query builder.

Fig 4: CueSo's map display and various display options. You can change colours, fonts, layer displays etc. And it's much faster than ArcGIS!

Fig 5: CueSo's GIS-like functionality showing Identify (point-and-click) and Search results.

Fig 6: Spatial analysis of signal strength received normalised by distance to station.

Fig 7: Graphing options (Countries contacted in this case). Also shows the list of available graphs that can be generated.

Fig 8: 3 day satellite environment plot from NOAA showing electron / proton flux, magnetometer and estimated planetary K values.

Fig 9: Latest auroral activity plot for the north pole (from NOAA).

Fig 10: Showing audio play back from a recorded QSO.

I want to add wave propagation modelling to the software at some stage in the future. The modelling will utilise the ITS Irregular Terrain Model algorithm for direct wave propagation - but I also hope to include ionospheric and satellite propagation modelling modules.

Friday, April 2, 2010

Modelling Geoidal Undulation and Reducing Ellipsoidal Elevations

I was once faced with a difficult geo-referencing problem. We had just acquired a rather extensive high resolution aerial photography and LIDAR campaign in an area that only had datum transform parameters accurate to 5m and no adequate geoidal model that could be used for reduction of ellipsoidal elevations to local. All data was delivered referenced to WGS84 (ITRF 2008) and the LIDAR data was in processed, ungridded, point cloud form (60 billion points). We needed:

  1. A model for vertical reduction from ellipsoid to local.

  2. An automated means of applying the vertical corrections.

  3. An automated means of gridding the LIDAR data.

  4. An automated means of re-projecting and applying higher precision horizontal transformations.

I determined the ellipsoid separation model by obtaining information from survey control stations where the local stations had also been occupied by GPS and those observations were available. These were few and far between and an even distribution across the survey area was highly desirable. In the end I managed to find 9 suitable primary control points and 4 secondary points that could be used for densifying the model.


Fig 1: Primary control points (large red crosses) and 4 additional secondary control points, taken from benchmarks of various topographic surveys and used for densification purposes (smaller red crosses).


A polynomial regression surface was fitted to the separation values from the control points using the ArcGIS Geostatistical Analyst software, and the quality of the regression fit was assessed.

Fig 2: Local Polynomial Regression Parameters in Geostatistical Analyst.



Fig 3: Geostatistical Analyst showing the regression function and fit to the input control data. The model fit had an average error of 6mm.
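For anyone without Geostatistical Analyst to hand, the idea can be sketched with a simple global first-order trend surface fitted by ordinary least squares. This is a simplification of the local polynomial model actually used, but it shows how the separation can be modelled as a smooth function of position:

```csharp
using System;

// Rough sketch: a global first-order trend surface z = a + b*x + c*y fitted to
// the control-point separation values by ordinary least squares. This is a
// simplification - Geostatistical Analyst's local polynomial interpolation fits
// polynomials in moving windows - but the principle is the same.
static class TrendSurface
{
    // Solves the 3x3 normal equations and returns the coefficients (a, b, c).
    public static (double A, double B, double C) Fit(double[] x, double[] y, double[] z)
    {
        int n = x.Length;
        double sx = 0, sy = 0, sz = 0, sxx = 0, syy = 0, sxy = 0, sxz = 0, syz = 0;
        for (int i = 0; i < n; i++)
        {
            sx += x[i]; sy += y[i]; sz += z[i];
            sxx += x[i] * x[i]; syy += y[i] * y[i]; sxy += x[i] * y[i];
            sxz += x[i] * z[i]; syz += y[i] * z[i];
        }

        // Augmented normal-equation matrix:
        // [ n   sx  sy  | sz  ]
        // [ sx  sxx sxy | sxz ]
        // [ sy  sxy syy | syz ]
        double[,] m = { { n, sx, sy, sz }, { sx, sxx, sxy, sxz }, { sy, sxy, syy, syz } };

        // Forward elimination (no pivoting - adequate for a sketch).
        for (int p = 0; p < 3; p++)
            for (int r = p + 1; r < 3; r++)
            {
                double factor = m[r, p] / m[p, p];
                for (int col = p; col < 4; col++) m[r, col] -= factor * m[p, col];
            }

        // Back substitution.
        double c = m[2, 3] / m[2, 2];
        double b = (m[1, 3] - m[1, 2] * c) / m[1, 1];
        double a = (m[0, 3] - m[0, 1] * b - m[0, 2] * c) / m[0, 0];
        return (a, b, c);
    }
}
```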


The regression surface was output at a suitably high resolution to a raster dataset. To automate the reduction and gridding of the point cloud data I turned to my trusty Raster Engine, which I modified so that the binning function could also accept a "correction" surface. I also added an interpolation method so that no "stepping" could be observed at the boundaries of the regression surface's cells in the final gridded LIDAR data. The LIDAR data had been delivered as several hundred tiles (for ease of data handling). The Raster Engine processed all tiles (60 billion points) with binning and vertical reduction in a matter of hours (without parallel processing).

Fig 4: 3D Visualisation of 3 processed LIDAR tiles in Priority Area 2. No "stepping" effects can be seen even under a vertical exaggeration of 25x - again validating the success of the sub-pixel interpolation of the separation grid.
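The core of that modification is simple enough to sketch: sample the separation grid at each LIDAR point's position with bilinear interpolation, then subtract the separation from the ellipsoidal elevation before the point is binned. The class below is illustrative only - the names and structure differ from the real Raster Engine:

```csharp
using System;

// Rough sketch of the vertical reduction step added to the binning function.
// The geoid/ellipsoid separation grid is sampled at each LIDAR point with
// bilinear interpolation, so the correction varies smoothly across grid cells
// and no "stepping" appears at cell boundaries.
sealed class SeparationGrid
{
    readonly double[,] values;                    // separation values in metres, [row, col]
    readonly double originX, originY, cellSize;   // grid origin and cell size in map units

    public SeparationGrid(double[,] values, double originX, double originY, double cellSize)
    {
        this.values = values;
        this.originX = originX;
        this.originY = originY;
        this.cellSize = cellSize;
    }

    // Bilinear interpolation of the separation surface at an arbitrary (x, y).
    public double SeparationAt(double x, double y)
    {
        // Continuous grid coordinates relative to cell centres.
        double gx = (x - originX) / cellSize - 0.5;
        double gy = (y - originY) / cellSize - 0.5;
        int col = Math.Max(0, Math.Min(values.GetLength(1) - 2, (int)Math.Floor(gx)));
        int row = Math.Max(0, Math.Min(values.GetLength(0) - 2, (int)Math.Floor(gy)));
        double fx = Math.Max(0.0, Math.Min(1.0, gx - col));
        double fy = Math.Max(0.0, Math.Min(1.0, gy - row));

        double v00 = values[row, col],     v10 = values[row, col + 1];
        double v01 = values[row + 1, col], v11 = values[row + 1, col + 1];

        return v00 * (1 - fx) * (1 - fy) + v10 * fx * (1 - fy) +
               v01 * (1 - fx) * fy       + v11 * fx * fy;
    }

    // Reduce an ellipsoidal elevation to the local vertical datum.
    public double ReduceToLocal(double x, double y, double ellipsoidalZ)
        => ellipsoidalZ - SeparationAt(x, y);
}
```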

A number of secondary survey points (34) were reserved (i.e. not part of the regression model input) as a means of checking the validity of the model and the resultant reduction. During comparison these showed an average error of 2cm.

Horizontal accuracy was improved by comparing the reprojected high resolution aerial photography with surveyed as-built features. A conformal transformation in grid space was derived for the entire survey area. Reprojection and application of the conformal transformation was automated using the ESRI re-projection engine. The entire dataset was processed in one weekend on a single workstation.
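For completeness, applying a four-parameter conformal (similarity) transformation in grid space amounts to a uniform scale, a rotation and two shifts, which together preserve shape. A rough sketch is shown below; the parameters are placeholders for values that would be solved by least squares from the as-built comparison:

```csharp
using System;

// Rough sketch of a four-parameter conformal (similarity) transformation in
// grid space: uniform scale, rotation and two translations. In practice the
// parameter values were derived by least squares from the comparison of
// reprojected imagery with surveyed as-built features.
struct ConformalTransform
{
    public double Scale;         // uniform scale factor
    public double RotationRad;   // rotation in radians
    public double Tx, Ty;        // translations in grid units

    public (double E, double N) Apply(double e, double n)
    {
        double cos = Math.Cos(RotationRad), sin = Math.Sin(RotationRad);
        double eOut = Scale * (cos * e - sin * n) + Tx;
        double nOut = Scale * (sin * e + cos * n) + Ty;
        return (eOut, nOut);
    }
}
```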