" in the query string to enforce it.Īnd yes, you may run into another issue in that the SELECT for double precision uses the Labview primitive "Fract/Exp String To Number Function". So the solution, as you say, is to use "%. replacing the VI "SQLite_Select DBL" with "SQLite_Select Str" and adding the Value False to the VI "Fract/Exp String to Number" (this modification was also done for the timeout case)ĭo you think these workarounds would solve the whole issue with the decimal point? Are there any other aspects I've neglected?Īlways use "." instead of "," as the decimal point even if the locale requests ",". I've tried the following workarounds by forcing the vi to use point as decimal point each time data is converted between string and value, and the example vi seems to work now with comma as default decimal point of Windows setting (see attachment): My assumption is, the time stamps for the start and end of the time range are causing the error message, since they are formatted using the system default decimal point, which is in my case a comma. If you work with DBs, you have to learn SQL. The syntax is standard SQLite SQL which is 99% compatible with M$ and MySQL. You can have parallel reads (but only a single write without getting busy errors) without going through all the examples and documents: Is there any description for the syntax used to where clause?Ĭan you post the error? Row and column count return integers and should not have anything to do with decimal points. Is it possible to manage and make parallel access to several "tables" within the same database or they have to be held in different databases? Or maybe you know a elegant workaround, without having to change the setting on Windows. 
Is it possible that the tool kit also works with numbers with comma as decimal mark (for Germany), coz when I first tried the zooming out, the VI "row col count" has reported error and the error was eliminated, after I had change the Windows settting for decimal mark to point. I still have some questions and I would appreciate it very much, if you could provide me some answers: Your tool kit seems to solve my whole issue which actually also includes the aspect of zooming. The issue would be whether you could log continuously at >200Hz - maybe with the right hardware and a bit of buffering. There is the "Data Logging" example which demonstrates this exactly in the SQLite API for LabVIEW. Has anyone ever dealt with this kind of issue and could provide me with some tipps or experiences? Thanks in advance! What I have been thinking about is, trying to stream the accquired data to the hard disk and read the decimated data points into the memory for plotting. Since the XY-Graphs have to be replotted periodically (at least quasi in real time), it seems to me that, the whole range of data somehow has to be available, no matter if decimation is being performed or not. Such acquisition typically lasts several days and a very large amount of data is therefore being created, so that the memory of the PC could run out very quickly, if the data is kept in the memory of the PC. My current application has to deal with real time data acquisition (200Hz or above) on multiple channels simutaneously and plotting the acquired data on XY-graphs (including decimation) in real time.
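The log-to-disk-and-read-decimated approach discussed in the opening posts can be sketched with plain SQLite outside LabVIEW. Below is a minimal sketch in Python using the stdlib sqlite3 module; the table and column names (samples, t, ch0) are illustrative, not taken from the toolkit. Keeping the per-bucket minimum and maximum (rather than every Nth sample) preserves peaks through decimation:

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # use a file path for real logging
conn.execute("CREATE TABLE samples (t REAL, ch0 REAL)")

# Simulate a 200 Hz acquisition: one row per sample, 50 s of data.
conn.executemany(
    "INSERT INTO samples VALUES (?, ?)",
    [(i / 200.0, float(i % 50)) for i in range(10_000)],
)
conn.commit()

# Decimated readback for plotting: min and max per 1-second bucket
# over a zoomed time range, instead of all 200 points per second.
rows = conn.execute(
    """SELECT CAST(t AS INTEGER) AS bucket, MIN(ch0), MAX(ch0)
       FROM samples
       WHERE t BETWEEN ? AND ?
       GROUP BY bucket
       ORDER BY bucket""",
    (0.0, 10.0),
).fetchall()
print(len(rows))  # 11 buckets: seconds 0..10 inclusive
```

Zooming then just means re-running the query with a narrower BETWEEN range and a smaller bucket size, so memory use depends on the plot width, not on the acquisition length.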
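The "parallel reads, single writer" point from the replies can also be demonstrated with plain SQLite. This sketch uses WAL journal mode, which is an assumption about configuration (not necessarily what the toolkit sets up); it lets reader connections proceed while a writer connection is open, whereas a second concurrent writer would get a busy error:

```python
import os
import sqlite3
import tempfile

# WAL mode needs a real file, not ":memory:".
path = os.path.join(tempfile.mkdtemp(), "log.db")

writer = sqlite3.connect(path)
writer.execute("PRAGMA journal_mode=WAL")  # readers don't block the writer
writer.execute("CREATE TABLE samples (t REAL, v REAL)")
writer.execute("INSERT INTO samples VALUES (0.0, 1.5)")
writer.commit()

# A second connection reads the same database file in parallel;
# only a second *writer* would raise "database is locked".
reader = sqlite3.connect(path)
count = reader.execute("SELECT COUNT(*) FROM samples").fetchone()[0]
print(count)  # 1
```

Note that both connections see one database file with any number of tables in it, which matches the reply: separate tables do not need separate databases.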
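On the decimal-point problem: the root cause is converting numbers through locale-dependent strings. SQL literals always use "." regardless of the OS locale, and in a text-based API the usual way to sidestep the locale entirely is to bind numbers as parameters so no string formatting happens at all. This is a general SQLite technique, not the toolkit's "%."-based fix; a sketch in Python:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE log (t REAL, v REAL)")
conn.execute("INSERT INTO log VALUES (1.25, 3.5)")

# SQL literals use "." no matter what the Windows decimal mark is set to;
# a query containing "1,25" would be parsed as two values, not one.
assert conn.execute("SELECT v FROM log WHERE t = 1.25").fetchone()[0] == 3.5

# Safer still: bind the number as a parameter, so no locale-dependent
# number-to-string conversion is involved anywhere.
row = conn.execute("SELECT v FROM log WHERE t = ?", (1.25,)).fetchone()
print(row[0])  # 3.5
```

The LabVIEW-side equivalent of this idea is exactly what the thread converges on: keep every number-to-string and string-to-number conversion locale-independent.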