This is an old post that originally appeared on expressionflow.com on May 9, 2007.
From the beginning, the LabVIEW developers recognized the need for some form of node or interface to allow LabVIEW users to link in code created outside of LabVIEW. The reasons for wanting this can be varied. Someone might want to interface to an existing application or code library, already developed and tested, in order to leverage its functionality. Other reasons include interfacing to third-party drivers and libraries, or to optimized code that needs to be executed for performance reasons.
This first installment about external code in LabVIEW gives a historical overview of the ways to interface to external code in the different LabVIEW versions.
LabVIEW 2, MacOS only, external code resources
This version of LabVIEW ran only on Macintosh computers. In the Macintosh 68k OS, executable code was really just another resource inside the resource fork of a file, alongside other resources such as icons, images, menus, text strings etc. LabVIEW therefore used this technology to implement an external code interface. The compiled object code was added to the LabVIEW VI simply as a code resource, in addition to the code resource LabVIEW created from the diagram code. Whenever the VI was loaded into memory, the compiled VI code resource and the additional external code resources were linked into the existing code base.
LabVIEW 3, CINs, complications and reasons
LabVIEW 3 (we leave out 2.5, as that was really only a 3.0 prerelease) was the first version to support operating systems other than the Macintosh OS. It is not fully clear to me if shared library support was already standard in the Sun OS (later to be called Solaris 1) operating system at that time. The first versions of shared library support under Unix were, however, generally quite buggy, and each Unix vendor had its own ideas about how it should work. The Macintosh OS also did not have a standard interface for shared libraries. One had to download an extension for this, and the first version of shared library support was at some point discontinued and replaced by a different, incompatible version. Here too, a number of nasty bugs were present that were only slowly ironed out with new releases.
But the most difficult platform was the DLL interface of the Windows version. It posed serious trouble due to the fact that LabVIEW was a flat 32-bit memory application built around a so-called DOS memory extender. This DOS memory extender was delivered with the Watcom C development system and provided applications with a true 32-bit environment while running on top of the old 16-bit environment used in Windows 3.x. Any application wanting to use that environment had to be compiled with Watcom C. This posed problems when calling normal Windows DLLs, since they were really 16-bit, and the stack as well as pointers inside the 32-bit environment had absolutely no meaning to a DLL running in the 16-bit environment. So when passing information between the 32-bit LabVIEW application and the 16-bit OS environment, the entire parameter stack and any pointers had to be properly translated.
Interfacing to Windows DLLs directly would have meant automagically translating every single parameter between the 32-bit LabVIEW environment and the 16-bit Windows memory model. For every single call, a complete new stack frame had to be allocated in the 16-bit environment, parameters had to be copied from the 32-bit memory into the 16-bit memory, and for pointers additional translations had to be done to make them valid in the 16-bit environment. On return of the function, these operations had to be reversed too. Watcom did provide some routines to deal with this translation, which was basically only possible by executing some involved assembly code internally, but the setup and configuration of those functions was quite an involved task. Therefore it was decided that this would be too large a development effort for an initial multi-platform LabVIEW version. Instead, the already existing idea of external code resources from the Macintosh OS was used and ported to Windows and Sun OS. A Windows CIN back then was in fact a native Watcom REX object code file, and a Macintosh OS CIN was a 68k object code file. The lvsbutil tool provided in the cintools directory wrapped this object code file into a CIN header and added the resulting file as another resource to the VI resources. This allowed LabVIEW to directly call the code resource in the same environment LabVIEW itself was running in, without involved memory translations that are difficult to handle automatically and also cause performance degradation. The disadvantage was that Windows CINs could only be generated by the Watcom C compiler, since they needed to be in the 32-bit REX object format, which no other compiler could generate.
I personally feel that the developers missed an opportunity here when they did not allow multiple code resources to be added to the same CIN. This made it necessary to manually load the correct platform-specific code resource into the VI whenever a VI was moved between platforms. This deficiency was never fixed in later versions of LabVIEW and caused a significant loss of convenience in using CINs for multi-platform libraries.
LabVIEW 4, Shared library support
LabVIEW 4 added the Call Library Node to interface directly to external shared libraries (Macintosh Code Fragment Manager components, which were not standard on non-PowerMac computers; Unix shared libraries; and Windows Dynamic Link Libraries).
Due to the limitations of the Macintosh Code Fragments and Windows 3.1 16-bit DLLs, the supported data types were limited. For instance, function return values other than void and numeric were not possible because of differences in how Borland and Microsoft DLLs returned pointer types as function return values. Function parameters more complex than strings and arrays were also not possible, because the parameters needed to be prepared accordingly for Windows 3.1 DLLs. Creating an automatic thunking interface supporting more complex data types would have been a real nightmare to implement and was therefore left for a possible later version of LabVIEW. Because of the 32-bit <-> 16-bit translation involved in accessing external DLLs under Windows 3.1, this solution also had lower performance than using CINs. Also, developing code for more than one platform was not very straightforward when using DLLs. This made CINs still the preferred external code solution in those days, especially when support for multiple LabVIEW platforms was desired or performance was an important issue.
LabVIEW 5, multi-threading
With the introduction of multi-threading support, the external code interfaces also got somewhat more complex in certain situations. To take advantage of multi-threading, the external code interfaces need to be configured to tell LabVIEW whether it is safe to call the external code from different threads. For CINs, this was done by exporting an additional function from the CIN that tells LabVIEW whether the CIN is safe for reentrant execution. LabVIEW automatically assumed unsafe behavior if this export was missing, forcing the CIN to always be executed in the only exclusively single-threaded execution system, the UI execution system. For DLLs, there was no automatic way of telling LabVIEW whether a DLL function was safe for reentrant execution. A DLL can also have multiple functions, some of which could be safe while others are unsafe, and Microsoft had never anticipated that a programming environment might be interested in this information about shared libraries. Therefore a manual configuration option was added to the Call Library Node configuration dialog. Only the developer of the shared library can really know if a function is safe or not. The LabVIEW VI developer has to decide whether to set the Call Library Node to call the function in any thread or to force it to run inside the UI execution system, either because he developed the DLL himself and understands the issues, or because he got this information from the documentation of the DLL. In many cases there is no specific information available in the documentation, and then the best option is to leave the Call Library Node configured to execute in the UI system. The alternative is trial and error, which can be cumbersome: race conditions and other errors resulting from unsafe reentrant execution can occur randomly and at different moments, depending on the previous execution order of library functions.
Many external factors such as system load or memory usage can also cause randomness in how race conditions make themselves noticed. Often you can execute an unsafe function countless times from a multi-threaded environment, only to find that the application starts to crash or produces unexplained calculation results after it has been shipped to the other side of the world.
LabVIEW 6, Extended shared library support
In LabVIEW 5.1, support for the 68k Macintosh and Windows 3.1 was dropped, and this allowed for enhanced data type support in the Call Library Node. A function could now also return a string, and function parameters could be configured to adapt to the LabVIEW data type, as no complicated automatic thunking needed to be performed anymore.
In LabVIEW 6.0 this was extended even further with ActiveX datatypes for the Windows platform, additional selections for the Adapt to Type parameter, and, as a gadget, a drop-down box to select among the available function names inside the shared library, unfortunately only on Windows. Another nice feature added is the Create .c file selection in the context popup menu of the Call Library Node. With this you can let LabVIEW create a C file with the correct prototype for the currently configured Call Library Node. This feature is especially handy if you happen to use the Adapt to Type parameter type, as LabVIEW obviously knows best how its own data types are to be declared in C syntax.
One indication that CINs were starting to be considered legacy technology by the LabVIEW developers is the removal of support for creating external subroutines. These were external code fragments not loaded into the VI itself but left as independent files on the file system, to be called by different CIN code resources. One of their applications was sharing common subroutines; another was providing a place to store global data shared among multiple CIN code resources. This was a fairly seldom-used feature, although the National Instruments NI-DAQ library and the LabVIEW Advanced Analysis library did make use of them before they were ported to shared libraries accessed through the Call Library Node interface.
LabVIEW 7, not much news on this front
LabVIEW 7 made no significant changes to the possibilities for incorporating external code in comparison to previous versions. The Call Library Node is quite mature and works well for almost any possible scenario, and CIN support was further marginalized by removing almost all use of it from the various LabVIEW function libraries provided by a fresh installation. National Instruments also finished porting almost all of its hardware interface libraries and add-on toolkits to use the Call Library Node instead of CINs.
LabVIEW 8, A few more improvements
In LabVIEW 8.0 the Call Library Node was left mostly alone and no new features were added. LabVIEW 8.2, however, improved the Call Library Node further by adding error terminals, allowing the path of the shared library to be passed in at runtime, and adding so-called callback functions, although this naming is in my opinion quite misplaced. Callback functions are usually function pointers that a caller provides to be called back by a library or other external component, and such a library can usually invoke them at any time it wants to inform the caller about something. What LabVIEW 8.2 really supports is the configuration of initializing and deinitializing functions that the LabVIEW environment will call before and after calling the actual function itself, plus an abort function called when the user aborts the VI hierarchy while LabVIEW is in the process of calling that Call Library Node. Obviously this is only possible for functions that are declared reentrant in the configuration.
LabVIEW 2009, 64-bit support
LabVIEW 2009 is the first version that also officially shipped as a 64-bit version, albeit only for Windows for now (64-bit versions for Linux and Mac OS X were introduced with LabVIEW 2014, together with support for NI Linux Realtime on the x86 and ARM realtime targets from National Instruments). Accordingly, when running in the 64-bit version of LabVIEW, all shared libraries that need to be called have to be compiled as 64-bit libraries too. For this, the Call Library Node got a new numeric data type that is pointer-sized. On the LabVIEW diagram this is always a 64-bit integer, but when it is passed to the shared library function, LabVIEW performs the correct coercion to a 32-bit or 64-bit pointer value.