
A question about methods for importing variables.

Posted: Sun 24 Jun 2012, 06:47
by sunburnt
I think technosaurus kinda answered this a while back.

I wondered about the RAM used by a hook file with each of the 3 methods.

1) If a literal string is used in a script, it takes up RAM when the script is run.

2) A variable passed as an argument has to hold its value in RAM somehow.

3) When a variable is exported, its value is stored in RAM.

So there's little difference in the RAM used by the methods... Correct?
The only difference I see is that 1 and 2 only use RAM if the file is run.

This seems to make good sense to me. Any thoughts or clarifications?
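
For reference, here's a rough sketch of what I mean by the 3 methods (the file and variable names are made up, just for illustration):

Code:
#!/bin/sh
# demo.sh - a made-up hook file showing the 3 methods side by side.

# 1) Literal string: the value is part of the script text itself.
val1="a literal string"

# 2) Argument: the caller passes the value on the command line,
#    e.g.  ./demo.sh "a string from an argument"
val2="$1"

# 3) Exported variable: the caller exports it before running the script,
#    e.g.  export VAL3="a string from the environment" ; ./demo.sh
val3="$VAL3"

echo "$val1"
echo "$val2"
echo "$val3"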

Posted: Sun 24 Jun 2012, 07:52
by jpeps
How many variables are you talking about? I tried exporting 25 strings, and got no diff running "free" before and after.

edit: made a mistake... got a difference. You can run a script to check it.
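
Something along these lines would do it (just a sketch; the count and the test string are arbitrary, and "free" looks at the whole system, so it's noisy):

Code:
#!/bin/sh
# Sketch: snapshot "free" before and after exporting a batch of strings.
# The count of 25 and the test string are arbitrary; "free" reports kB.

before=$(free | awk '/Mem:/ {print $3}')    # "used" column before

i=0
while [ $i -lt 25 ]; do
    export "TEST$i=some reasonably long test string value"
    i=$((i + 1))
done

after=$(free | awk '/Mem:/ {print $3}')     # "used" column after
echo "used before: $before kB  after: $after kB  diff: $((after - before)) kB"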

Posted: Sun 24 Jun 2012, 08:04
by sunburnt
But then, where is the variable=value held?

I'm guessing it's reserved RAM, so it already shows as RAM used.
That being the case, it doesn't matter, as it doesn't impact free RAM.

I don't know enough about the way Linux handles this.

My main Q was whether there's any difference in RAM use between the 3 methods.

Posted: Sun 24 Jun 2012, 08:12
by jpeps
sunburnt wrote:But then, where is the variable=value held?

I'm guessing it's reserved RAM, so it already shows as RAM used.
That being the case, it doesn't matter, as it doesn't impact free RAM.

I don't know enough about the way Linux handles this.

My main Q was whether there's any difference in RAM use between the 3 methods.
Edited my first post. Run a script with "free" before and after each method to see the diffs.

edit: too crude a test (too much sd).
edit: I don't see much difference with and without "export" over repeated trials (although it would make sense that exporting would store them longer in the stack).
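
If the system-wide numbers bounce around too much, one possibly less noisy check is to watch the shell's own resident memory in /proc instead (just a sketch; the variable names and the count are invented):

Code:
#!/bin/sh
# Sketch: print this shell's resident memory (VmRSS) before and after
# exporting a batch of variables, instead of using system-wide "free".

grep VmRSS /proc/$$/status     # shell's resident memory before

i=0
while [ $i -lt 1000 ]; do
    export "JUNK$i=................................................"
    i=$((i + 1))
done

grep VmRSS /proc/$$/status     # shell's resident memory after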

Posted: Sun 24 Jun 2012, 09:20
by amigo
Linux (the kernel) doesn't have anything to do with it - it's the shell which is allocating memory (through libc). You will be hard-pressed to demonstrate any difference at all, I think. I wouldn't worry about how much RAM the shell is using - except: when you use lots of functions you could see RAM usage increase visibly. For scripts under 1,000 lines I wouldn't worry about it at all.

My src2pkg contains more than 10,000 lines of functions which are sourced - that is, read into RAM for later execution. With such an amount you can see increased RAM usage. But that is offset by the quicker execution of the code. Interpreting while reading line-by-line from a script is slower than first reading the code into RAM and then running it later - at least for large functions. There is also a difference between running code directly and calling it from a function when the piece of code is small.
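
Roughly like this (a minimal made-up example, nothing from src2pkg):

Code:
#!/bin/sh
# Sketch: source a small "library" of functions once; after that the
# calls run from the shell's memory without re-reading any file.

# Write a tiny made-up library to a file.
cat > /tmp/mylib.sh <<'EOF'
greet() { echo "hello, $1"; }
shout() { echo "HELLO, $1!"; }
EOF

. /tmp/mylib.sh     # reading the whole file into RAM defines the functions

greet world         # runs from memory - no file is read for this call
shout world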

I use lots of functions because of all the benefits of doing so.

Posted: Tue 26 Jun 2012, 05:20
by technosaurus
As Amigo stated, the speed difference between using a function and a separate script can be extreme - this is especially true for something that is used many times, such as recursive functions. But I wouldn't worry too much about the resources: I converted nearly every default Puppy script to a function and it was barely noticeable for small scripts (busybox time said it was the same when using ash, but bash did show a slight slowdown - bash's time shows 1 more significant digit)

... but for larger scripts that used to call several other scripts and now just call functions, there were significant, measurable speedups in both ash and bash (see the bashbox thread for more info)
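
Something like this shows the difference on its own - just a sketch using bash's time keyword; the loop count and the trivial body are made up:

Code:
#!/bin/bash
# Sketch: time many calls to a separate script vs. the same code as a
# shell function.

cat > /tmp/ext.sh <<'EOF'
#!/bin/sh
echo "$1"
EOF
chmod +x /tmp/ext.sh

samefunc() { echo "$1"; }

echo "separate script, 200 calls:"
time for i in $(seq 200); do /tmp/ext.sh x >/dev/null; done

echo "shell function, 200 calls:"
time for i in $(seq 200); do samefunc x >/dev/null; done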

Posted: Tue 26 Jun 2012, 20:45
by sunburnt
Thanks technosaurus; function libraries are a favorite for portability.

You're right, amigo; I tend to think of the kernel as handling everything,
but it's more the piece of software that's running at the moment.


I ran tests and the RAM usage was all over the place;
not much difference between the 3 methods at all.