- I am trying to log a large number of A/D samples for testing of my device, and would like to be able to download 10 seconds' worth of data. Each data point consists of two A/D samples (each encoded as 4 ASCII bytes) and a single control byte, with the fields separated by tabs and the record ended by a newline, so each data point is 4 + 1 + 4 + 1 + 1 + 1 = 12 characters long. 1000 data points are generated each second, so a 10-second test produces 120,000 bytes. After a test, the data is downloaded from the ESP-32 using
curl -d "{\"filename\": \"datafile.txt\"}" 192.168.4.1/rpc/FS.Get
I am partially successful at this (as described below), and I suspect there is some limitation that is preventing full success.
- I collect the data into arrays p, q and n at 1000 samples per second, then call the following function:
int scanzone (char* ZSstring, int ZSlen, int depth){
    /* fp, loggerstring, dumstring, i, the data arrays p, q, n, and
       loglimit and logpointer are globals declared elsewhere. */

    /* First half of the log goes to datafile1.txt. */
    fp = fopen("datafile1.txt", "w+");
    loggerstring[0] = '\0';
    for (i = 0; i < loglimit/2; i++){
        sprintf(dumstring, "%d", p[i]);  strcat(loggerstring, dumstring); strcat(loggerstring, "\t");
        sprintf(dumstring, "%d", q[i]);  strcat(loggerstring, dumstring); strcat(loggerstring, "\t");
        sprintf(dumstring, "%1d", n[i]); strcat(loggerstring, dumstring); strcat(loggerstring, "\n");
    }
    fputs(loggerstring, fp);
    fclose(fp);

    /* Second half goes to datafile2.txt. */
    fp = fopen("datafile2.txt", "w+");
    loggerstring[0] = '\0';
    for (i = loglimit/2; i < loglimit; i++){
        sprintf(dumstring, "%d", p[i]);  strcat(loggerstring, dumstring); strcat(loggerstring, "\t");
        sprintf(dumstring, "%d", q[i]);  strcat(loggerstring, dumstring); strcat(loggerstring, "\t");
        sprintf(dumstring, "%1d", n[i]); strcat(loggerstring, dumstring); strcat(loggerstring, "\n");
    }
    fputs(loggerstring, fp);
    fclose(fp);

    logpointer = 0;
    return 1;
}
After this is complete, I attempt to download the data using:
curl -d "{\"filename\": \"datafile.txt\"}" 192.168.4.1/rpc/FS.Get
- The results depend on the size of the arrays p, q and n and on the value of logpointer. If 3000 data points are taken using the above method, everything works perfectly: I get 3000 properly recorded data points in Base64-encoded form. If I attempt to increase to 4000 data points, I receive no response to
curl -d "{\"filename\": \"datafile.txt\"}" 192.168.4.1/rpc/FS.Get
(i.e., control returns to the command-line prompt after about 1 second, and no data or error message is received).
I am monitoring the available heap size in both cases: with 3000 data points the free heap space is about 160 KB; with 4000 data points it is reduced to about 150 KB.
- I am sure that the sudden change between 3000 and 4000 data points is due to some system limitation that I am not aware of. Can anyone describe such a limitation (or suggest a better process for achieving the desired result)?
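For what it's worth, one approach I have been considering: if the RPC service is Mongoose OS's FS.Get, it also accepts "offset" and "len" arguments, so a large file could be fetched in fixed-size chunks rather than in one response the device must buffer and Base64-encode whole. The sketch below assumes that API; the filename, file size, and chunk size are placeholders, and the loop echoes the commands so they can be reviewed first (drop the echo to actually run them).

```shell
# Assumed values: adjust to the real file size and a chunk size
# the device can comfortably buffer.
FILE=datafile1.txt
SIZE=48000     # total bytes to fetch
CHUNK=4096     # bytes per FS.Get call

# Emit one FS.Get invocation per chunk; remove "echo" to execute.
for OFFSET in $(seq 0 $CHUNK $((SIZE - 1))); do
  echo curl -d "{\"filename\": \"$FILE\", \"offset\": $OFFSET, \"len\": $CHUNK}" \
       192.168.4.1/rpc/FS.Get
done
```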
Thank you as always for your expert assistance,
JSW