views: 1218

answers: 12

This is marked as a subjective question; I hope I won't get too many downvotes, though.

LV seems to offer a nice graphical alternative to traditional text-based programming. As I understand it, it's not just a virtual instrumentation/data acquisition programming language. Nonetheless, it seems to have that paradigm pegged to its creator's name.

My question comes up because it doesn't seem to be widely used for general-purpose applications. I'm not an LV expert of any kind; I'm more of a learner, still getting used to LV.

+8  A: 

My two experiences with "graphic alternative[s] to traditional text based programming" have been dreadful. I find such languages to be slow to use, hard to edit, and inexpressive. Debugging them is a nightmare. And they offer no real advantages.

To be sure, it has been quite a long time since I looked at one, but the opinions of others I've asked about them have been only lukewarm, so I have never taken the time to look again. Reasons to look again are welcome and will be taken on board...

dmckee
I guess this is an honest and literal answer to the question, but I don't see that it adds much to a debate about the merits of LabVIEW.
nekomatic
+4  A: 

I thought LabVIEW was a dream for FPGA programming. Independent executable blocks just... work. In general, I use LabVIEW for various tasks interfacing with my DAQ and FPGA hardware, but that's about it. It seems (again to me) that this is LabVIEW's strong point and the reason it was built, but outside that arena it feels "cumbersome." As far as getting things done, it's like any other language with a learning curve - once you figure it out it's not too bad for getting work done. I've seen several people give up before that point, thinking the learning curve was permanent or something.

Picking up a 30" monitor made a huge difference.

I know one thing that people dislike is the version control integration.

Edit: LabVIEW/hardware is hella expensive for "just for fun" use. I dropped $10K on their hardware (student prices) and got the software for free from school for making toys around the house.

280Z28
"hardware is *hella* expensive" AFAIK that is true of all decent DAQ gear.
dmckee
A: 

I do use LabVIEW at home, as it is part of Lego Mindstorms, which my son loves. And I really like composing systems this way.

However, in my work (embedded systems), it is generally too restrictive. But here too, I'm trying to move up in abstraction:
- control and state behavior: model-based design (e.g. Rhapsody)
- data algorithms etc.: Simulink

Sometimes a graphical model can require more clicks than a piece of code. But this also includes the work a good programmer needs to do in design & documentation, not just the code typing. The graphical notation takes many hassles away and is generally much faster if the tool is powerful enough for the complexity at hand. So I expect these kinds of tools will gain more popularity in the coming years as they mature and people get familiar with them.

Adriaan
+6  A: 

LabVIEW is fantastic if you have National Instruments hardware and want to do something like acquire, plot and log the data.

When you start interfacing to custom devices, the wiring between modules gets complicated, since you have to do all the string manipulation work for input to and output from the device.

At my place of work, we found that we got annoyed with having to make massive, complicated VIs to interface to devices, so we started writing the device interfaces in .NET and connecting them to LabVIEW.

In the end we ended up scrapping LabVIEW altogether and using NI Measurement Studio for Visual Studio to give us all the lovely-looking NI controls (waveform plot, tank, gauges, switches etc.) with the flexibility of C#.

In summary, even with a couple of 24" screens, sometimes the wiring for LabVIEW code can get too complex and become impossible to comment, debug, and make extensible for any future changes. I suggest taking a look at Measurement Studio for Visual Studio and using your favourite .NET language with the pretty NI controls.
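As a rough sketch of the .NET route described above (not the poster's actual code), wrapping a string-protocol instrument behind a small C# class might look like this; the port settings, command string and reply format are hypothetical, and the Measurement Studio controls aren't shown:

```csharp
using System;
using System.Globalization;
using System.IO.Ports; // built into .NET Framework; the System.IO.Ports NuGet package on .NET Core/5+

// Hypothetical wrapper for an instrument that speaks a line-based ASCII protocol.
public sealed class SerialInstrument : IDisposable
{
    private readonly SerialPort _port;

    public SerialInstrument(string portName)
    {
        _port = new SerialPort(portName, 9600, Parity.None, 8, StopBits.One)
        {
            NewLine = "\r\n",   // assumed line terminator for this sketch
            ReadTimeout = 2000,
            WriteTimeout = 2000
        };
        _port.Open();
    }

    // Send a command and return the device's one-line reply.
    public string Query(string command)
    {
        _port.WriteLine(command);
        return _port.ReadLine();
    }

    // Example: read a voltage, assuming the device replies with a plain number.
    public double ReadVoltage() =>
        double.Parse(Query("MEAS:VOLT?"), CultureInfo.InvariantCulture);

    public void Dispose() => _port.Dispose();
}
```

The point is simply that the string formatting and parsing for each device live in one reusable class, rather than being spread across block-diagram wiring.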

Fuzz
If "the wiring between modules gets complicated when you start interfacing to custom devices" it sounds as if you haven't been organising and encapsulating the different levels of functionality within subVI's. As others have noted, it's easy to make a mess with LabVIEW if you do it badly - but it's not that hard to do it well if you take a little trouble to learn.
nekomatic
I have tried doing this in several ways, nekomatic, but there really is no way to make an event, with a loop inside it and a few conditions inside that, look pretty. Most of my issues come with maintaining LabVIEW "code" made by scientists with little coding experience who just want to get their job done (because that is how it is marketed), and without care and attention their wiring just gets crazy very fast!
Fuzz
If I understood your attempt to explain what you were trying to do I might be able to help. I thoroughly agree with you about inexperienced coders making a mess - but that doesn't mean LabVIEW isn't a good tool for people who want to write good code.
nekomatic
Most of the applications I have seen that frustrated me are controlling rigs with a few RS-485 devices and perhaps an NI acquisition system. What I have found is that keeping GUI elements responsive so they don't lock up, while also making sure the control system and measurements have deterministic timing, became more complicated in LabVIEW than it was in a text-based programming language. That said, I have been a C++/Java/C# developer a lot longer than I have worked with LabVIEW, so I prefer to see it laid out in code rather than symbolically.
Fuzz
When I am without my written languages, I have found that the best way is to write the system in distinct modules, and then use the interprocess communication VIs to pass messages and triggers between the elements.
Fuzz
That sounds like exactly the right approach: separate user interface code from control logic from I/O and use techniques such as queues, notifiers or functional globals to pass data between them. A big plus of LabVIEW is that once you do this, the scheduling of the different threads is done for you automatically.
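For readers more comfortable in text, the same queued producer/consumer idea looks roughly like this in C# (a minimal sketch, not LabVIEW-generated code; the sample type and timing are invented for illustration):

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;

class ProducerConsumerSketch
{
    static void Main()
    {
        // The queue decouples the acquisition loop (producer) from display/logging work
        // (consumer), much like a queued message handler in LabVIEW.
        var samples = new BlockingCollection<double>(boundedCapacity: 1000);
        using var cts = new CancellationTokenSource(TimeSpan.FromSeconds(2));

        var producer = Task.Run(() =>
        {
            var rng = new Random();
            while (!cts.IsCancellationRequested)
            {
                samples.Add(rng.NextDouble()); // stand-in for a real DAQ read
                Thread.Sleep(10);              // hypothetical 100 Hz acquisition rate
            }
            samples.CompleteAdding();          // lets the consumer finish cleanly
        });

        // The consumer blocks until data arrives, so neither loop busy-waits.
        foreach (var s in samples.GetConsumingEnumerable())
            Console.WriteLine($"sample: {s:F3}");

        producer.Wait();
    }
}
```

In LabVIEW the equivalent decoupling comes from queues or notifiers between parallel loops, with the thread scheduling handled for you, as noted above.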
nekomatic
+3  A: 

We use LabVIEW for running our end-of-line test equipment and it is ideal for data acquisition and control. Typically measuring 15 to 80 differential voltages and controlling environmental chambers, mass flow controllers and various serial devices, LabVIEW is more than capable.

Interfacing with custom devices can be simplified greatly by using the NI instrument driver wizard to create reusable VIs, interfacing with custom DLLs if needed. On a number of projects we have created such drivers for custom hardware and, once created, they are reusable in future projects with no modification.
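On the text-language side of the same idea, a reusable driver that wraps a vendor DLL might be sketched in C# with P/Invoke as below; the DLL name and exported functions are invented purely to illustrate the pattern, not taken from any real driver:

```csharp
using System;
using System.Runtime.InteropServices;

// Hypothetical wrapper around a vendor-supplied C DLL for a custom device.
public static class CustomDeviceDriver
{
    // These exports are made up for the sketch; a real DLL defines its own API.
    [DllImport("customdevice.dll", CallingConvention = CallingConvention.Cdecl)]
    private static extern int cd_open(int deviceIndex, out IntPtr handle);

    [DllImport("customdevice.dll", CallingConvention = CallingConvention.Cdecl)]
    private static extern int cd_read_voltage(IntPtr handle, out double volts);

    [DllImport("customdevice.dll", CallingConvention = CallingConvention.Cdecl)]
    private static extern int cd_close(IntPtr handle);

    public static double ReadVoltageOnce(int deviceIndex)
    {
        if (cd_open(deviceIndex, out var handle) != 0)
            throw new InvalidOperationException("cd_open failed");
        try
        {
            if (cd_read_voltage(handle, out var volts) != 0)
                throw new InvalidOperationException("cd_read_voltage failed");
            return volts;
        }
        finally
        {
            cd_close(handle);
        }
    }
}
```

Whether the wrapper is a driver VI or a class like this, the payoff is the same: the raw DLL calls are hidden behind one interface that later projects can reuse unchanged.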

Using event-driven structures, user interfaces are responsive, and we regularly use LabVIEW applications to interface with a database.

Whatever programming environment you choose, it's the process of designing the application that matters most. I agree that you can create some really horrible and unreadable block diagrams in LabVIEW, but then you can also create unreadable code in Visual Studio. With just a little thought and planning, a LabVIEW block diagram can be made to fit on a single 24" monitor with plenty of space to add comments.

I would use LabVIEW over Visual Studio for most projects.

Swinders
A: 

I have been using LabVIEW for about two years for developing automation. Given due care and proper design, we can certainly develop maintainable and really good-looking applications in LabVIEW. I think this is the same for all the other languages out there. I have seen equally bad code in LabVIEW, primarily from people who use it only to develop quick and dirty working automation. IMHO graphical programming is a lot easier to code and understand if done right. That said, I feel text-based programming 'feels' more powerful!

LabVIEW is primarily marketed for industrial automation, has inherent support for a lot of NI hardware, and you can get third-party hardware working with it pretty quickly. I think that is the reason you see it only in the automation field. Moreover, it is pretty costly and you are locked in with NI, as you cannot even open your code if you do not buy the software from them!

Manoj
+2  A: 

Our company has been using LabVIEW for the last 10 years for measuring, monitoring and reporting on our subject (trains).
Recently we have started using LabVIEW as a GUI for databases with lots of data; the power of LabVIEW with the recent new features (classes, XControls) allows us to create these kinds of GUIs for a fraction of the development cost on other platforms, and without needing external programmers at consultancy rates.

Ton
A: 

But people do use LabVIEW for purposes other than data acquisition and virtual instrumentation. Of course LabVIEW is mainly used in labs and production environments, because that is (or was) one of NI's main customer targets.

However, you can do lots of different things with LabVIEW, like programming a robot that performs a lot of image analysis and then tweets the results. Have a look at videos from NI Week 2009 on YouTube and you'll see how powerful this tool is. For instance, it is possible to write code and deploy it to ARM MCUs (see this Dev Monkey article from 2009.08.10).

And finally, check out this LabVIEW DIY group.

Jakub Czaplicki
Thing is, I've seen a lot of neat things done with the wrong tools. The question is not whether LabView can be used for X, but whether it would be a good idea to do so if one had other tools.
David Thornley
+1  A: 

I've been thinking about this question for decades (yes, since 1989...)

Like all programming languages, LabVIEW is a high-level tool used to manipulate the flow of electrons. Unless you are a purist and refuse to use anything beyond a breadboard and wires, then transistors, integrated circuits and programming languages are probably a good thing if you wish to build something of any consequence.

But like all high-level tools, just wielding one does not make you a professional craftsman. Back in the day of soldering irons, op-amps and UARTs it required a large amount of careful study before you could create a system that actually functioned. The modern realm of text-based languages is so overly dominated by syntax that the programmer must get it just right before it will compile and run. In order to write code that works, the programmer must increase their skill level to create systems much larger than "Hello World".

LabVIEW is not dominated by syntax, but by Data Flow. Back in the day, reaching for your flow charting template and developing the diagram of a well-balanced information system was the art and beauty part of the job. Only after you had the reviewed flowchart in hand would you even consider slogging through the drudgery of punching out the code. (yes... punch cards)

LabVIEW is a development system that allows the programmer to use flow charting tools to diagram the complete information system and press "run"..... LabVIEW "punches out the code" and compiles it for you. No need to fight through the syntax of text language A or language B.

With such a powerful tool, novices can build large, working programs rapidly -- implying some level of professional craftsmanship since it runs at all. However, if the system does not perform elegantly, or the source code diagram is a mess, it is not the fault of LabVIEW.

People often claim that "LabVIEW is only good for developing large data acquisition systems." Perhaps those people should consider the professionalism of the scientists and engineers who are working in data acquisition. If they know enough to get the actual wires right for the sensors and transducers, it may be a good bet that they are expert at developing LabVIEW wiring diagrams as well.

williamlweaver
+1  A: 

LabVIEW can be used to author large, complex software projects. LabVIEW is unquestionably much more fun to use than a syntax-based language. I have programmed mathematically dense, dynamic simulations using LabVIEW. Newer versions of LabVIEW include a lot of exciting features, especially for utilizing multiple processors. I like LabVIEW very much. But I don't recommend it to anyone.

Unfortunately, it's an absolute nightmare for anything other than simple acquisition and display. It may one day be sufficiently developed to be considered as a viable alternative to text-based languages. However, the developers at NI have consistently opted to ignore the three fundamental problems that plague LabVIEW.

1) It is unstable and riddled with bugs. There are thousands of bugs that have been posted to the LabVIEW support forums that are yet to be fixed. Some of these are quite serious, such as memory leaks, or mathematical errors in basic functions.

2) The documentation is atrocious. More often than not, when you look for help with a LabVIEW function in the local help file you'll find a sentence that merely restates the name of the item you are trying to find some detail on. E.g. a user looks up the help file on the texture filter mode setting and the only thing written in the help file is "Texture Filter Mode - selects the mode used for texture filtering." Gee, thanks. That clears things right up, doesn't it? The problem goes much deeper than that: quite often, when you ask a technical representative from National Instruments to provide critical details about LabVIEW functionality or the specific behavior of mathematical functions, they simply don't know how the functions in their own library work. This may sound like an exaggeration, but trust me, it's not.

3) While it's not impossible to keep graphical code clean and well documented, LabVIEW is designed to make these tasks both difficult and inefficient. In order to keep your code from becoming a tangled, confusing mess, you must routinely (every few operations) employ structures like clusters, subVIs and giant type-defined controls (which can stretch over multiple screens in a large project). These structures eat memory and destroy performance by forcing LabVIEW to make multiple copies of data in memory and perform gratuitous operations, all for the sake of keeping the graphical diagram from looking like rainbow-colored spaghetti with no comments or text anywhere in sight. Programming in LabVIEW is like playing Pictionary with the devil. Imagine your giant software project written as a wall-sized flowchart with no words on it at all. Now imagine that all the lines cross each other a thousand times so that tracing the data flow is completely impossible. You have just envisioned the most natural and most efficient way to program in LabVIEW.

LabVIEW is cool. LabVIEW is getting better with each new release. If National Instruments keeps improving it, it will be great one day as a general programming language. Right now, it's an extremely bad choice as a software development platform for large or logically complex projects.

truthisnotbeauty
A: 

I have used LabVIEW for some 10 years. It's brilliant for scientific programming, i.e. like MATLAB or Simulink, but 10 times better. If you are having problems then you are doing something wrong. It takes time to learn, like any language. As for using .NET instead - are these people even on the same planet? Why would you go to the trouble of writing everything from scratch when you can, say, pull up an FFT and use already written code? .NET is fine for simple programs but not so good for scientific processing. Yes, you can do it, but not without oodles of add-ons for graphics etc. Programming in G is far easier than text-based for scientific problems. You can of course program in C if you are interfacing and use the DLL. Now there are things that I would not use LabVIEW for - speech recognition, for example, may be a bit messy at present. More to the point though, why do people like programming in outdated text form when there is an easy alternative? It is as if people want to make things complicated so as to justify their job in some way. Simplify, simplify!
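For comparison, the "pull up an FFT" point looks roughly like this on the .NET side, assuming you first add a third-party maths package such as MathNet.Numerics (my choice for the sketch; the thread doesn't name a specific library):

```csharp
using System;
using System.Numerics;
using MathNet.Numerics.IntegralTransforms; // NuGet: MathNet.Numerics

class FftSketch
{
    static void Main()
    {
        // Build a 1 kHz sine sampled at 8 kHz, then transform it in place.
        const int n = 1024;
        const double sampleRate = 8000.0, toneHz = 1000.0;
        var samples = new Complex[n];
        for (int i = 0; i < n; i++)
            samples[i] = new Complex(Math.Sin(2 * Math.PI * toneHz * i / sampleRate), 0);

        Fourier.Forward(samples); // in-place FFT

        // Find the dominant bin in the first half of the spectrum.
        int peak = 0;
        for (int i = 1; i < n / 2; i++)
            if (samples[i].Magnitude > samples[peak].Magnitude) peak = i;

        Console.WriteLine($"Peak at {peak * sampleRate / n} Hz"); // expect ~1000 Hz
    }
}
```

In LabVIEW the equivalent is an FFT VI wired to a waveform graph, which is exactly the convenience being pointed at here; in .NET you pick and install the add-on first.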

Tom
@Tom: Good points. The case you are making is the same case that .NET developers make to C++ developers. Why make things complicated, write them from scratch, when so many things like events, garbage collection, code generation, etc. are done for you. However, we should all remember that, sometimes, you really need that low-level, gritty access under the covers.
Jeff Meatball Yang
A: 

Somebody said that LabVIEW is only used in the automation field. Simply not right at all. It has applications in digital signal processing, control systems, communications, web-based applications, mathematics, image processing and so on. It started as a data acquisition method and they invented the name Virtual Instrumentation, but it has gone far beyond that now. It is a scientific programming language with a second-to-none graphical interface. It is way beyond Simulink, and if you like MATLAB then it has a type of MATLAB scripting built in for those that like such ways of programming. It is evolving all the time. The one thing I found difficult was writing code for the CompactRIO - tricky, but far easier than the alternative. It's expensive but you get a quality product. I personally have not found any bugs in ordinary programming. It is an engineer's language, but anybody could use it to program.

Tom