How to use Bash in Puppy?

williams2
Posts: 337
Joined: Fri 14 Dec 2018, 22:18

#21 Post by williams2 »

This script works like locate. locate itself used to be a shell script, a long time ago.

Also, pfind finds files.
Attachments
f1.tar.gz

tallboy
Posts: 1760
Joined: Tue 21 Sep 2010, 21:56
Location: Drøbak, Norway

#22 Post by tallboy »

Mike wrote: Apparently, dpup-stretch's version of bash includes that command. It is not a default package; you have to install it from the main Debian Stretch repo in the PPM.

It is NOT the same as mlocate!

coldmonday, if you type busybox --help in a terminal window, you get a list of all the available applets. The same option, -h or --help, works for the separate commands in BusyBox and for most other Linux commands, and will usually give a brief overview of usage.

If you want to see all the commands available to you in Lucid or any other Linux, just press Tab twice in the terminal window...
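For the curious, the tab-tab list is just every executable on your $PATH; a rough sketch of getting the same list non-interactively (the directories involved will vary between Puppies):

```shell
# Roughly what pressing Tab twice shows: every executable on the PATH,
# gathered by listing each PATH directory in turn.
echo "$PATH" | tr ':' '\n' | while read -r dir; do
    ls "$dir" 2>/dev/null
done | sort -u > commands.txt

wc -l < commands.txt    # how many distinct commands are available
```

Piping through sort -u removes duplicates where the same command appears in more than one directory.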

coldmonday
Posts: 59
Joined: Fri 10 Mar 2017, 16:23

#23 Post by coldmonday »

6502',

Useful link, thanks.

I have (eventually) to create a spreadsheet with a few hundred temperature readings. They will need to be findable by day or date, and I will need to be able to find days and compare them with other days, etc.

Do you think 'find' will be able to locate individual files on Libre spreadsheets?

At the moment all the readings are in small files on my XP laptop, so they will have to be transferred (rather, copied) to Puppy for working on.

Dave.

An aside: is it possible to recover a lost root password?


Tallboy, I'll look into that.

rufwoof
Posts: 3690
Joined: Mon 24 Feb 2014, 17:47

#24 Post by rufwoof »

coldmonday wrote:An aside, is it possible to recover a lost root password ?
Puppy logs you straight into the root desktop, so just open a terminal and type

Code: Select all

passwd
and enter a new password for root (you have to type it twice to confirm that the two entries match).

6502coder
Posts: 677
Joined: Mon 23 Mar 2009, 18:07
Location: Western United States

#25 Post by 6502coder »

It sounds like the temperature files must have names that indicate the date. It would help if you could give an example, so we know exactly what the filenames look like. Then we could give you detailed help on how to proceed. Does each file contain just one temperature reading for one day?

I'm not clear on what you mean by being "able to locate individual files on Libre spreadsheets". Do you mean that the names of the temperature files are (or will be) entered into cells on a spreadsheet? The "find" command only determines where files are by examining directories in the Linux file system; it can't read a spreadsheet and examine the contents of cells.

coldmonday
Posts: 59
Joined: Fri 10 Mar 2017, 16:23

#26 Post by coldmonday »

6502',

You have more or less answered my question in your post.

Basically, there will be 12 monthly folders. Each folder will hold 28-31 day folders. Each day folder will hold a list of temperatures (in a LibreOffice spreadsheet) taken every five minutes throughout the day.

I've been taking these temperature readings for the last three years, the idea being that I can compare temperatures on particular days for each year.

So I need to be able to look up, say, 10:00 on the second of April for any of the years and compare them.

I gather that I could find the files by date, but not access the LibreOffice spreadsheet data from Linux.

At the moment the files are .XLS in my OpenOffice spreadsheets.

The name format is N TL(05-01-2018@18.XLS)


This file starts at 18:10:06 on 5 January 2018 and ends at 08:52:06 on 8 January 2018.

There are 1436 readings taken. Actually, there are 2872 readings, as the logger takes two at a time.

The TL stands for Thermadata Logger.

So it is quite a big data set.

Dave.

6502coder
Posts: 677
Joined: Mon 23 Mar 2009, 18:07
Location: Western United States

#27 Post by 6502coder »

Hi Dave,

As far as I can tell, what you want to do is perfectly possible. Most spreadsheets have a command line utility that will convert a spreadsheet file into a CSV file. Once you have a CSV file, it is a simple matter to do what you want.

Based on this article:

https://stackoverflow.com/questions/105 ... mmand-line

it seems the command in Libreoffice for converting a spreadsheet to CSV format is

Code: Select all

libreoffice --headless --convert-to csv "$filename"


Assuming this works, all we need to know is which values are in which columns in your spreadsheets. I'm guessing it's something simple like time in column 1 and temperature in column 2.

Try converting one of your spreadsheets to CSV format, and if it works, post a few lines from the CSV so we can see exactly what the data look like.

EDIT: Most Puppies have the Gnumeric spreadsheet, for which the converter is "ssconvert" -- I believe this can convert XLS to CSV, so that's worth a try.

The syntax seems to be:

Code: Select all

ssconvert --export-type=Gnumeric_stf:stf_csv foo.xls foo.txt 
where foo.txt will be the resulting CSV file.

EDIT2: I forgot to ask, the filenames really look like "N TL(05-01-2018@18.XLS)"? Really? With the space after the N, the "@" sign, and the parentheses all part of the filenames?
Last edited by 6502coder on Thu 09 May 2019, 04:08, edited 1 time in total.

6502coder
Posts: 677
Joined: Mon 23 Mar 2009, 18:07
Location: Western United States

#28 Post by 6502coder »

coldmonday wrote: The name format is N TL(05-01-2018@18.XLS)

This file starts at 18:10:06 on 5 January 2018 and ends at 08:52:06 on 8 January 2018.

There are 1436 readings taken. Actually there are 2872 readings as the logger takes two at a time.
Sorry, please excuse me for being thick! I'm having trouble with the math here. If readings are "taken every five minutes throughout the day" that's 12 readings per hour, or 24*12=288 readings per day. The time span you cite from 5 Jan to 8 Jan is less than 3 full days, which would be well short of 900 readings, much less 1436. In fact it looks like 718 would be about right for the time span cited, with 1436 being correct if that already takes into account the "two at a time."

Also, this time span indicates that my assumption of one spreadsheet file per day is wrong; evidently there can be more than one day's worth of data (and possibly LESS?) in a single spreadsheet file. Also, the start and stop times look irregular. Does this mean that it is possible for some of the data for, say, April 1 to actually be in the "March" folder, because it was included in a spreadsheet file that spanned the end of March and the beginning of April? That can be handled, although it does somewhat complicate things.

And lastly, in this example, I guess the spreadsheet file would be found in the "05" day folder of the "January" folder. Since the data in this file extends all the way into part of 8 Jan, does this mean that the "06" and "07" day folders would contain NO spreadsheet files for 2018?

coldmonday
Posts: 59
Joined: Fri 10 Mar 2017, 16:23

#29 Post by coldmonday »

The times are a bit variable at the moment.

I started taking readings once every two minutes. But this meant I had to dump the full logger every couple of days.

So I changed it to every three minutes. That helped a bit. I'm going to 'rationalise' the system by changing the time to once every five minutes. May be better if I use once every six minutes. That gives me ten readings per hour and two hundred and forty per day; nice round figures.

Unfortunately there will be many hours spent 'correcting' the original files to make them fit this new format.

Should have used six minute intervals right from the beginning.

Not much changes with the weather in six minutes.

You are correct in assuming that there may be more than one day's worth of data in each file. It has varied from three to four. And sometimes I have forgotten to dump the file to the computer, and thus there are gaps.

The logger is in the greenhouse (it's a portable one and has to be brought to the computer to dump the data), with an internal probe measuring the greenhouse temperature and an outside probe measuring the outside air temp. It is surprising how quickly the inside temp drops to match the outside temp overnight. A single sheet of glass isn't a very good insulator.

But it is a long time project, and it keeps me amused.

6502coder
Posts: 677
Joined: Mon 23 Mar 2009, 18:07
Location: Western United States

#30 Post by 6502coder »

Hi Dave,

Here's a summary of my understanding of the problem and how a possible solution would work.

The Data
You have a main folder THERMADATA which has 12 subfolders, one for each month. Just for the sake of argument we'll suppose the folders are named "01" through "12". Each "month" folder in turn has 28-31 subfolders, according to the number of days in the month. These "day" folders are named "01" through "31".

Temperatures are recorded several times an hour, and these (time, temperature) pairs are recorded in XLS spreadsheets, one pair per row. The spreadsheets have names which indicate the year, month, day-of-month, and hour of the first (time, temperature) pair in that spreadsheet.

Each spreadsheet is stored in the month folder and day subfolder that corresponds to the first entry in that spreadsheet. For example, if the spreadsheet is named "TL05-01-2018@18.XLS" then this indicates that the first datum was recorded on January 5th during the 6PM (18th) hour, and this spreadsheet would therefore be found in the folder THERMADATA/01/05. Year is not a differentiator, so the spreadsheet "TL05-01-2017@02.XLS" would also be found in the folder THERMADATA/01/05.

The Goal:
Given a month, day-of-month, and time-of-day, we want to extract all the temperatures recorded, for all years covered in the database.

The Method:
Since we have the month and day-of-month we know exactly -- modulo factors to be discussed later -- which folder to look in. If the month is MM and the day-of-month is DD, we know to look in the THERMADATA/MM/DD folder. We will go into that folder, convert each spreadsheet there to CSV format, then run that through a filter that extracts only those entries that have the desired time-of-day. In bash pseudo-code it would look something like this (assume a time-of-day of 1305):

Code: Select all

for f in *.xls
do
    convertXLStoCSV "$f" | grep 1305 >> MM_DD_data
done
"convertXLStoCSV" is a hypothetical name for the spreadsheet format converter program. "grep" is a UN*X/Linux command that extracts lines containing a specified string. So, for each XLS file in the directory, we convert it to CSV and pipe that into grep, which extracts the CSV rows that have the correct time-of-day. The results are collected in a file named MM_DD_data. We can then sort this data and extract values as needed for whatever analysis we have in mind.
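One wrinkle with the plain "grep 1305" is that it matches the string anywhere on the line, so a temperature such as 13.05 could slip through. Once the data is in CSV form, awk can match the time field exactly. A sketch, assuming a hypothetical layout with the time as HH:MM in column 2 (adjust -F and $2 to the real layout):

```shell
# Build a small sample CSV: date,time,temperature (hypothetical layout).
printf '%s\n' \
    '05-01-2018,13:05,21.4' \
    '05-01-2018,13:10,13.05' \
    '06-01-2018,13:05,19.8' > sample.csv

# Keep only the rows whose second (time) field is exactly 13:05.
awk -F',' '$2 == "13:05"' sample.csv > matched.csv

cat matched.csv   # two rows; the 13:10 row is excluded even though it contains "13.05"
```

A plain grep '13.05' would have matched all three lines, since the dot in a regular expression matches any character.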

The Fine Print
The "method" as described gives the essence of the solution. It would have to be embellished to accommodate these additional factors:

a) the data for a given month and day-of-month may be contained in a spreadsheet whose name does NOT reflect that day and month: this can happen if the spreadsheet contains data from more than one day, as in the example you gave previously of a spreadsheet with 3 days' worth of data. So we may need to grab XLS files from prior or subsequent days as appropriate. This is further complicated by the possibility that this might carry us across a month boundary.

b) the resolution and regularity of the time-of-day is an issue. If we ask for the data of March 14 at 0920, but a spreadsheet for March 14, 2017 contains entries only at 0916 and 0925, then what do we do? What counts as "close enough" to the requested time-of-day?

These complications are annoying, but certainly can be overcome, especially since your dataset is actually quite small. This allows the programmer to use code that is inefficient but easier to write.

coldmonday
Posts: 59
Joined: Fri 10 Mar 2017, 16:23

#31 Post by coldmonday »

6502',

Yes, you pretty much have it covered there.

I suspect I will have to come back to the above when I have spent some time getting a firm grasp of the essential Linux commands etc.

Also I have to re-arrange all the files into the same format, i.e. sorting all the samples into lists of those at six-minute intervals. That should while away a few evenings.

Below is a screen dump of a section of one of the files that has been transferred from ThermaData to OpenOffice using the .XLS option.

This version is taking samples every three minutes.

It will be quite easy to re-list them every six minutes: just create a file taking every second reading from the original. The four- and five-minute versions will need a bit more massaging to make them fit, taking the reading nearest to the six-minute marks. A minute or two doesn't make any real difference in this application.
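The "every second reading" step is a one-liner once a file is in plain-text or CSV form. A sketch with made-up values (awk's NR variable is the current line number):

```shell
# Five readings at three-minute intervals (made-up values, one per line).
printf '%s\n' 10.1 10.2 10.3 10.4 10.5 > three_min.txt

# Keep lines 1, 3, 5, ... -- every second reading, giving six-minute spacing.
awk 'NR % 2 == 1' three_min.txt > six_min.txt

cat six_min.txt   # 10.1, 10.3, 10.5
```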

I'll reset the logger to sample every six minutes at the end of the month.

It's just basically four columns, five if I add an incremental day-number starting on 1st Jan.
Attachments
ttlxls.jpg

coldmonday
Posts: 59
Joined: Fri 10 Mar 2017, 16:23

#32 Post by coldmonday »

I recall (I think) that somewhere on the site people were discussing the need to work from a 'home' directory and not from root.

I'd like to re-read that item but can't find it via 'Search'.

Anyone remember where it is (was) ?

Dave.

mikeslr
Posts: 3890
Joined: Mon 16 Jun 2008, 21:20
Location: 500 seconds from Sol

#33 Post by mikeslr »

Offhand, I don't have exact links. But, beginning with (I think) FatDog64-700, it was structured to have and use a "Top-Level" /home folder -- that is, at the same level as /root, /mnt etc. -- to house Chrome and its clone web browsers (and, maybe, VLC), to overcome the problem that their publishers precluded them from running as root and required that they run as a limited user. FatDog devs may have had other advantages in mind with that change. I don't follow its developments too closely.

Mike Walsh eventually structured his google-chrome64 packages to create and utilize such a /home directory. The change involves more than a simple relocation. Files downloaded into the /home folder have different permissions. Root can access and work with those files within that folder; but if those files are moved in order to become generally available, their ownership has to be changed to root. Similarly, before an application -- such as a web browser (maybe VLC) -- can make use of a file copied or moved to the /home folder, the ownership of that file must be changed to spot, the limited user built into Puppies.

You might want to examine the "Spot2Root" pet he published here, http://murga-linux.com/puppy/viewtopic. ... 040#985040, to see how it changes permissions. The pet preceded the relocation of /spot to the /home folder. Versions of Google Chrome after 65.0.3325.146 were built to include it and revise it somewhat.

Keef
Posts: 987
Joined: Thu 20 Dec 2007, 22:12
Location: Staffordshire

#34 Post by Keef »

coldmonday

What is it that you think this would be necessary for?
The usual reason for plonking things in /mnt/home is to avoid filling up a savefile (savefiles are of a fixed size). If you use a save folder you don't have that problem, as it is just another directory on your HDD.

coldmonday
Posts: 59
Joined: Fri 10 Mar 2017, 16:23

#35 Post by coldmonday »

Mikeslr,

Thanks, I'll look at it.

Keef,

As one could completely screw up the operating system by doing something silly while in root, it seems logical to me to work in another directory to avoid just this possibility.

Pretty much the same way that it didn't seem a good idea to work directly from the DOS directory. Safer to jump straight to some working directory or other.

Belt and braces ?

Just thought I'd re-read the thinking on the subject.

Dave.

mikeslr
Posts: 3890
Joined: Mon 16 Jun 2008, 21:20
Location: 500 seconds from Sol

Safe Experimenting under Puppy

#36 Post by mikeslr »

Hi coldmonday,

Frugal Puppies are pretty indestructible, provided you don't do something dumb like deleting /mnt/home/puppy/vmlinuz or .../initrd.gz or .../puppy_version.sfs, directly or by erroneous code*.

The core files of Puppy are initrd.gz, vmlinuz, puppy_version.sfs, and, if present, zdrv_puppy_version.sfs and fdrv_puppy_version.sfs. These are READ-ONLY. On bootup they -- or parts of them -- are read into Random Access Memory. Unlike Windows and almost every other operating system, your actual operating system exists only in RAM, and only until you shut down. Only on reboot will the same operating system be present; or a different one, if you've modified the single READ-WRITE component of your system, the SaveFile or SaveFolder.

If you run under Pupmode 13, with SaveSession set to zero (0) (optionally, ask at shutdown), you can test any application by restarting X immediately after installing it, without executing a Save to your SaveFile/Folder. Puppy re-catalogs what's on its system, but the "installation" is only in RAM. So, if there's a problem, you can reboot without saving, clearing the application from RAM without ever having committed it to your SaveFile/Folder; it was never actually a part of your 'permanent' operating system.

I prefer SaveFiles: you can't modify them accidentally by deleting or editing files within them. They are only modified via the Save mechanism; or by mounting them, copying their contents to a folder, editing the contents, running dir2sfs on that folder and substituting the result for the old SaveFile.

If you're really into 'treading dangerous waters', it makes sense to create a backup SaveFile/Folder. If accidentally or inadvertently you've screwed up your SaveFile/Folder, you can then boot pfix=ram, delete the problem SaveFile/Folder and replace it with your backup.

Additionally, applications created as SFSes never become an integral part of your System. They are loaded and unloaded as needed. But keep in mind they have lower priority in the "merged in RAM" operating system than other components: files and structures --such as python-- within such SFSes which conflict with parts having higher priority won't be used. Consequently, applications which might function if installed might not function if merely loaded as an SFS.

As mentioned, the /home folder was created to house the Spot limited user system primarily because of difficulties inherent in Google-Chrome and its clones. The Spot limited-user system preceded it, and is built into every Puppy (AFAIK). Any application can be run 'as Spot'. However, applications run-as-spot do NOT interact with other parts of your system.

Additionally, you might want to examine Barry K's development of 'Containers' and particularly rufwoof's discussion of them under FatDog64. I don't know to what extent Puppies, in general, are able to use Containers.

====
* When I was in the 10th or 11th grade, a friend bet me that he could prove that '2' equals '1'. Thinking of myself at the time as 'somewhat of a math whiz', and certain he was bluffing, I said "Sure, go ahead." Well, in a series of about 9 algebraic steps he wrote equations beginning with a=2, b=1 and ending with these last two:
a=b
2=1

It was only by carefully substituting actual numbers in each of the steps that the trick became apparent: one step involved dividing (multiplying? it's been 60 years) by zero.

My ego somewhat deflated, I've since held a healthy belief that the pen can be faster than the mind, and that it's always best to know all the implications of a symbol.

coldmonday
Posts: 59
Joined: Fri 10 Mar 2017, 16:23

#37 Post by coldmonday »

mikeslr wrote: My ego somewhat deflated, I've since held a healthy belief that the pen can be faster than the mind; and that its always best to know all the implications of a symbol.

I take it you mean that in the case above 1 and 2 were being used purely as symbols and held no numeric value. Basically just A and B in different clothes.

Tricky blighters, these mathematicians.

That is why I don't trust any predictions that can't be proved by plugging in numbers and experimentation.

As for the original question. It's unlikely I'll be reaching the level needed to follow your post for some time to come.

Dave.

williams2
Posts: 337
Joined: Fri 14 Dec 2018, 22:18

#38 Post by williams2 »

mikeslr wrote: or by mounting them, copying their contents to a folder, editing the contents, dir2sfs such folder and substituting it for the old SaveFile
If you mount a save file read-write, you can easily copy, delete, modify or move any of the files and directories in the save file directly.

You can prove this by copying any save file (the one Puppy is using if you like) to a temporary file, then mounting it. For example:

Code: Select all

# cd /mnt/home/
# cp bionicpup64save.3fs tmp.3fs
# mount -o loop tmp.3fs /mnt/data/
# rox /mnt/data/
# umount /mnt/data/
# rm tmp.3fs
You can modify anything in the file system in the tmp.3fs file directly.

A file with a sfs suffix is a read only squash file system. A squashfs is usually compressed.

A file with a .2fs, .3fs or .4fs suffix is a save file. It is readable and writable and is NEVER compressed. It is usually the top layer of a layered file system.

This is a little bit oversimplified, but not much. For example, you can make an adrv.sfs file from a savefile and have it automatically added to the layered file system when Puppy boots. But usually a savefile is a savefile and an sfs file is an sfs file.

You can modify files in your savefile directly while it is mounted in the layered file system, but you usually should not do that. It confuses aufs. If you do modify files directly in /initrd/pup_rw/ you should reboot immediately after doing it.

coldmonday
Posts: 59
Joined: Fri 10 Mar 2017, 16:23

#39 Post by coldmonday »

It looks as if the need for extracting data from the temperature records has become rather moot.

I noticed a couple of weeks ago, on a particularly warm day, that the record for that day showed 42C on both inputs.

Now, it was quite warm, but not that warm.

Tonight I looked into the problem and found that both inputs are reading 9.3C high. And I can't trim out the error, as the trim limit is 3C either way.

It has a new battery installed, and I checked that: 3.64 volts against an ideal 3.65 volts. So that is good.

Looks as if I will have to send the device back to the manufacturer for fixing.

The real problem is that I don't know when it went faulty, so really all the readings are suspect.

Just have to write the whole project off and start again.

Major bummer.

You just can't trust electronics.

Dave.

mikeslr
Posts: 3890
Joined: Mon 16 Jun 2008, 21:20
Location: 500 seconds from Sol

Searching is better than relying on memory

#40 Post by mikeslr »

Posted just to clarify my story.
coldmonday wrote: I take it you mean that in the case above 1 and 2 were being used purely as symbols and held no numeric value. Basically just A and B in different clothes.
Nope. While my memory, never great, is even more suspect than it used to be, I long ago realized that recorded information could be found and more recently realized that the Web, although it can be a font of misinformation, can also serve as a massive external memory.

So, here's the math trick https://www.chess.com/forum/view/off-to ... -proves-21:

Let...

a = 1

b = 1

That means

a = b

Multiply both sides by a:

a² = ab

Subtract b² from both sides:

a² - b² = ab - b²

Factor:

(a + b)(a - b) = b(a - b)

Divide both sides by (a - b):

(a + b)(a - b) / (a - b) = b(a - b) / (a - b)

Cancel the (a - b) factors:

a + b = b

If a = 1 and b = 1, then

1 + 1 = 1

2 = 1
