Brief description of requirements

(Lots of good answers here, thanks to all, I'll update if I ever get this flying).

A detector runs along a track, measuring several different physical parameters in real time (deterministic), as a function of curvilinear distance. The user can click on a button to 'mark' waypoints during this process, then uses the GUI to enter the details for each waypoint (in human time, while the data acquisition continues).

Following this, the system performs a series of calculations/filters/modifications on the acquired data, taking into account the constraints entered for each waypoint. The output of this process is a series of corrections, also as a function of curvilinear distance.

The third part of the process involves running along the track again, but this time writing the corrections to a physical system which corrects the track (still as a function of curvilinear distance).

My current idea for your input/comments/warnings

What I want to determine is whether I can do this with a PC + FPGA. The FPGA would do the data acquisition, and I would use C# on the PC to read the data from a buffer. The waypoint information could be entered via a WPF/WinForms application and stored in a database/flat file/anything pending processing.

For the processing, I would use F#.

The FPGA would then be used for writing the information back to the physical machine.

The one problem that I can currently foresee is that the processing algorithms might require a sampling frequency which makes the quantity of data to buffer too big. This would imply offloading some of the processing to the FPGA - at least the bits that don't require user input. Unfortunately, the only pre-processing algorithm is a Kalman filter, which is difficult to implement on an FPGA, from what I have googled.
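For reference, the per-sample arithmetic of a one-dimensional Kalman filter is tiny - something like this minimal F# sketch (the constant-position model and the noise values are just illustrative assumptions on my part):

```fsharp
// Minimal scalar (1D) Kalman filter sketch, constant-position model.
// q = process noise variance, r = measurement noise variance - illustrative values only.
type KalmanState = { x : float; p : float }

let update (q : float) (r : float) (state : KalmanState) (z : float) =
    let pPred = state.p + q                    // predict: uncertainty grows by the process noise
    let k     = pPred / (pPred + r)            // Kalman gain
    { x = state.x + k * (z - state.x)          // blend prediction and measurement
      p = (1.0 - k) * pPred }

// Run the filter over a stream of raw measurements.
let filterAll q r x0 measurements =
    measurements
    |> Seq.scan (update q r) { x = x0; p = 1.0 }
    |> Seq.skip 1                              // drop the initial guess
    |> Seq.map (fun s -> s.x)
```

Each update is only a handful of multiplications and divisions per sample, so the arithmetic itself isn't the worry - it's where the buffered data lives.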

I'd be very grateful for any feedback you care to give.

UPDATES (extra info added here as and when)

At the input to the Kalman filter we're looking at one sample every 1 ms. But on the other side of the Kalman filter, we would be sampling every 1 m, which at the speeds we're talking about would be about two samples per second.

So I guess more precise questions would be:

  1. implementing a Kalman filter on an FPGA - it seems to be possible, but I don't understand enough about either subject to work out just HOW possible it is.

  2. I'm also not sure whether an FPGA implementation of a Kalman filter will be able to cycle every 1 ms - though I imagine that it should be no problem.

  3. If I've understood correctly, FPGAs don't have hod-loads of memory. For the third part of the process, where I would be sending an (approximately) 4 x 400 array of doubles to use as a lookup table, is this feasible? (A rough calculation follows this list.)

  4. Also, would swapping between the two processes (reading/writing data) imply re-programming the FPGA each time, or could it be instructed to switch between the two? (Maybe it's possible just to run both in parallel and ignore one or the other.)

  5. Another option I've seen is compiling F# to VHDL using Avalda FPGA Developer; I'll be trying that soon, I think.
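To put rough numbers on points 2 and 3 (the channel count and run length here are assumptions on my part, not measured figures):

```fsharp
// Back-of-the-envelope figures; channel count and run duration are assumed for illustration.
let channels      = 8                          // assumed number of measured parameters
let sampleRateHz  = 1000.0                     // one sample every 1 ms at the Kalman input
let bytesPerValue = 8.0                        // one double
let bytesPerSec   = sampleRateHz * float channels * bytesPerValue
// = 64 kB/s, i.e. roughly 230 MB for an hour-long run - trivial for USB/Ethernet or a hard disk.

let lookupTableBytes = 4 * 400 * 8             // the ~4 x 400 array of doubles
// = 12,800 bytes (~12.5 KiB, ~100 kbit) - well within the block RAM of even small FPGAs.
```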

+2  A: 

Since you are moving along a track, I have to assume the sampling frequency isn't more than 10 kHz. You can offload the data to the PC at that rate easily, even over 12 Mbit/s USB (full speed).

For serious processing of math data, Matlab is the way to go. But since I haven't heard of F#, I can't comment.

4 x 400 doubles is no problem. Even low-end FPGAs have hundreds of kilobits of memory.

You don't have to change images to swap between reading and writing. That is done all the time in FPGAs.

Brian Carlton
@Brian Carlton, thanks for your answer. F# is a functional language in the .NET fold; I think the advantage over Matlab is that it is more easily distributable. The disadvantage is that you have to roll most of it yourself, and it is pretty bleeding edge.
Benjol
+1  A: 

What is your connection to the PC? .NET will be a good fit if it is a network-based connection, as you can use streams to deal with the data input.
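For example, if it does end up being a network connection, the PC side can be a very small piece of code - something like this sketch (the host, port and "one double per channel" framing are assumptions):

```fsharp
open System.Net.Sockets
open System.IO

// Sketch: read fixed-size sample frames from a TCP stream.
// Host, port and frame layout (one double per channel) are assumptions for illustration.
let readSamples (host : string) (port : int) (channels : int) =
    seq {
        use client = new TcpClient(host, port)
        use reader = new BinaryReader(client.GetStream())
        while true do                          // ends with an exception when the peer closes
            yield Array.init channels (fun _ -> reader.ReadDouble())
    }
```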

My only warning to you regarding F#, or any functional programming language, with large data sets is memory usage. They are wonderful and mathematically provable, but when you get a stack overflow exception from too many recursions it means that your program won't work, and you lose time and effort.
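To illustrate: a naive recursion allocates a stack frame per element, while a tail-recursive accumulator (or the built-in folds) runs in constant stack space - roughly like this (the function names are just examples):

```fsharp
// Naive recursion: one stack frame per element - will overflow on a long enough list.
let rec sumNaive xs =
    match xs with
    | []      -> 0.0
    | x :: tl -> x + sumNaive tl

// Tail-recursive version: the recursive call is the last operation,
// so it compiles down to a loop and stack depth stays constant.
let sumTail xs =
    let rec go acc xs =
        match xs with
        | []      -> acc
        | x :: tl -> go (acc + x) tl
    go 0.0 xs

// In practice the library folds are already written this way:
let sumFold xs = List.fold (+) 0.0 xs
```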

C# will be great if you need to develop a GUI; WinForms and GDI+ should get you to something usable without a monumental effort.

Give us some more information regarding data rates and connection and maybe we can offer some more help?

Spence
Well, the FPGA and PC would be in the same place, so they can be connected any way you like. From what Brian Carlton said, USB sounds quite cool. Looking at data rates now.
Benjol
Point taken about stack overflow, but tail recursion is one thing that I *have* understood about F# (I think), so I hope I could avoid that particular problem.
Benjol
F# can index directly into arrays; it has reference types and doesn't always use copy semantics when calling. Hence we'd be passing around a reference and not a copy of the array on every call. Also, GDI+ is software-rendered while WPF is GPU-rendered, so GDI+ is SLOWER!
Henrik
GDI calls have been hardware-accelerated since Windows 98; WPF is faster at doing 3D special effects, but GDI is still fine for a normal GUI. When I said faster I was referring to development, not execution performance.
Spence
+1  A: 

There might be something useful in Microsoft Robotics Studio, especially for the real-time aspect. The CCR (Concurrency and Coordination Runtime) has a lot of this thought out already, and the simulation tools might help you build a model that would help your analysis.

Data Dave
+1  A: 

Sounds to me like you can do all the processing offline. If this is the case, then offline is the way to go. In other words, divide the process into three steps:

  1. Data acquisition
  2. Data analysis
  3. Physical system corrections based on the data analysis.

Data Acquisition

If you can't collect the data using a standard interface, then you probably have to go with a custom interface. It's hard to say whether you should be using an FPGA without knowing more about your interface. Building custom interfaces is expensive, so you should do a trade-off study to select the approach. Anyway, if this is FPGA-based, then keep the FPGA simple and use it for raw data acquisition. With current hard drive technology you can easily store hundreds of gigabytes of data for post-processing, so store the raw data on a disk drive. There's no way you want to be implementing even a one-dimensional Kalman filter in an FPGA if you don't have to.
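The PC side of "just log everything" is only a few lines in .NET - something like this sketch (the file name and the distance-plus-channels frame layout are assumptions):

```fsharp
open System.IO

// Sketch: append raw sample frames to a binary log for offline processing.
// The path and the (distance, channel values) layout are assumptions for illustration.
let logSamples (path : string) (samples : seq<float * float[]>) =
    use writer = new BinaryWriter(File.Open(path, FileMode.Append))
    for (distance, values) in samples do
        writer.Write(distance)                           // curvilinear distance
        values |> Array.iter (fun v -> writer.Write(v))  // one double per measured parameter
```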

Data Analysis

Once you've got the data on a hard drive, then you have lots of options for data analysis. If you already know F#, then go with F#. Python and Matlab both have lots of data analysis libraries available.

This approach also makes it much easier to test your data analysis software than a solution where you have to do all the processing in real time. If the results don't seem right, you can easily rerun the analysis without having to go and collect the data again.
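Reading the log back for another pass is equally small (assuming the same layout as the logging sketch above):

```fsharp
open System.IO

// Sketch: read the logged frames back so the analysis can be rerun offline.
// Assumes the same (distance, N doubles) layout as the logging sketch above.
let readFrames (path : string) (channels : int) =
    use reader = new BinaryReader(File.OpenRead(path))
    [ while reader.BaseStream.Position < reader.BaseStream.Length do
        let distance = reader.ReadDouble()
        yield distance, Array.init channels (fun _ -> reader.ReadDouble()) ]
// ...then apply whatever filtering/analysis you like to the returned frames,
// without ever going back to the track.
```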

Physical System Corrections

Take the results of the data analysis and run the detector along the track again feeding it the appropriate inputs through the interface card.

billmcc
+1  A: 

I've done a lot of embedded engineering, including hybrid systems such as the one you've described. At the data rates and sizes you need to process, I doubt that you need an FPGA ... simply find an off-the-shelf data acquisition system to plug into your PC.

I think the biggest issue you're going to run into is more related to language bindings for your hardware APIs. In the past, I've had to develop a lot of my software in C and assembly (and even some Forth) simply because that was the easiest way to get the data from the hardware.

Steve Moyer
+3  A: 

You don't mention your goals, customers, budget, reliability or deadlines, so this is hard to answer, but...

Forget the FPGA. Simplify your design, development environment and interfaces unless you know you are going to blow your real-time requirements with another solution.

If you have the budget, I'd first take a look at LabView.

http://www.ni.com/labview/

http://www.ni.com/dataacquisition/

LabView would give you the data acquisition system and user GUI all on a single PC. In my experience, developers don't choose LabView because it doesn't feel like a 'real' programming environment, but I'd definitely recommend it for the problem you described.

If you are determined to use compiled languages, then I'd isolate the real-time data acquisition component to an embedded target with an RTOS, preferably one that takes advantage of the MMU for scheduling and thread isolation and lets you write in C. If you get a real RTOS, you should be able to reliably schedule the processes that need to run, and also be able to debug them if need be! Keep this separate target system as simple as possible, with defined interfaces. Make it do just enough to get the data you need.

I'd then implement the interfaces back to the PC GUI using a common interface file for maintenance. Use standard interfaces for data transfer to the PC, something like USB2 or Ethernet. The FTDI chips are great for this stuff.

BenB
I guess that means labview doesn't interest you then? :)
BenB
LabView is under review as a possible alternative for prototyping.
Benjol
+2  A: 

Here is a suggestion.

Dump the FPGA concept. Get a DSP evaluation board from TI. Pick one with enough gigaflops to make you happy and enough RAM to store your working set.

Program it in C. TI supplies a small real-time kernel.

It talks to the PC over, say, a serial port or Ethernet - whatever.

It sends the PC cooked data with a handshake so the data doesn't get lost. There is enough RAM in the DSP to store your data while the PC has senior moments.

No performance problems with the DSP.

The real-time bit does the real-time work, with megabytes of RAM. Processing is fast, and the GUI is not time-critical.
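The PC end of that handshake is only a few lines too - a sketch using System.IO.Ports (the port name, baud rate, frame size and ACK byte are all assumptions):

```fsharp
open System.IO.Ports

// Sketch: receive fixed-size frames of cooked data from the DSP over a serial port
// and acknowledge each one. Port name, baud rate, frame size and the ACK value are
// assumptions for illustration.
let receiveFrames () =
    use port = new SerialPort("COM3", 115200)
    port.Open()
    let frame = Array.zeroCreate<byte> 512
    while true do
        let mutable got = 0
        while got < frame.Length do
            got <- got + port.Read(frame, got, frame.Length - got)
        // ...hand 'frame' to the rest of the application here...
        port.Write([| 0x06uy |], 0, 1)         // ACK so the DSP can release its buffer
```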

Tim Williscroft