geany files not transferrable between computers

nubc
Posts: 2062
Joined: Tue 23 Jan 2007, 18:41
Location: USA

#16 Post by nubc »

Truth is, I chickened out at your suggestion. When you say to navigate to the folder, I assume you mean to actually open the folder. Additionally, I am guessing that no other folder should be open. I chickened out because I didn't want to accidentally remove Exec permissions in the entire home directory where the Index folder resides. I was not familiar with the back-tick call of the command prompt. Thanks for your suggestion. Will try it next week.
I wonder why the command didn't work when I did everything from the command prompt. I cd'ed to the folder, did an ls to confirm the contents, and ran the command, but the Exec checks were not removed.
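(For reference, the command-line sequence being attempted was roughly the following -- assuming the Index folder lives under /root, which is only an example path:)

Code:

cd /root/Index    # enter the folder holding the geany files
ls                # confirm the expected files are listed
chmod 644 *       # clear the execute bits: owner rw-, group r--, others r--
ls -l             # verify: each file should now show -rw-r--r--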
Last edited by nubc on Mon 21 Jan 2013, 04:29, edited 3 times in total.

rcrsn51
Posts: 13096
Joined: Tue 05 Sep 2006, 13:50
Location: Stratford, Ontario

#17 Post by rcrsn51 »

I just tried it by going into the target folder strictly through the command line and it worked.

nubc
Posts: 2062
Joined: Tue 23 Jan 2007, 18:41
Location: USA

#18 Post by nubc »

The next time I cold-booted the computer and viewed the test Index directory that had apparently failed to respond to [chmod 644 *], the files contained therein had in fact reverted to text file format, despite my initial observation to the contrary. I tried the back-tick technique on another Index folder and observed the file changes instantaneously, so I am satisfied with this result. Solved.
Last edited by nubc on Wed 11 Sep 2013, 13:50, edited 1 time in total.

nubc
Posts: 2062
Joined: Tue 23 Jan 2007, 18:41
Location: USA

#19 Post by nubc »

I forgot to ask: What happens if I try to open a geany file in Windows XP?

nooby
Posts: 10369
Joined: Sun 29 Jun 2008, 19:05
Location: SwedenEurope

#20 Post by nooby »

So this happens if one uses geany to make text files without .txt.
What if one uses Leafpad instead and still uses a FAT-formatted USB drive
to do the transfer?

Is it the FAT format or the lack of .txt that messes things up, or
would that get solved by using the Leafpad editor instead?

One problem I have with Geany is that it does not like åäö
in titles.
I use Google Search on Puppy Forum
not an ideal solution though

npierce
Posts: 858
Joined: Tue 29 Dec 2009, 01:40

#21 Post by npierce »

nubc wrote:I forgot to ask: What happens if I try to open a geany file in Windows XP?
Short answer:

It will open, and probably look fine if using a fairly recent text editor.


Medium answer:

The reason I say "probably" in the short answer above is that the MS-DOS/Windows world and the Unix/Linux world have differing preferences when it comes to the control characters used to represent the end of a line of text. Years ago this was a bit of a headache. But modern text editors usually accept either preference.

As an experiment, I booted up an old PC that still had Windows 98 on one partition. Notepad refused to deal with the Linux NL (New Line) characters, represented them as black rectangles, and printed each line at the end of the previous line. But WordPad displayed the Linux files fine. And loading the file with WordPad and saving it created a file that Notepad could display.


Long answer:

The files that geany creates are simply text files, and can usually be displayed in any text editor because they use standard encodings (such as ASCII, ISO-8859-1, and UTF-8) and have no special formatting codes like, for instance, word processor documents or spreadsheets.

I say "usually" only because geany supports a lot if encodings, so it is possible to save a geany file with an encoding that isn't supported by every text editor. But if you stick with common encodings like UTF-8, or ISO-8859-1 (the default used for geany in the Puppies I've seen) you should be able to use any modern text editor.

The one fly in the ointment is that different operating systems prefer different control characters to signify the end of a line of text. How can that be? Didn't I just say that text files use "standard encodings"?

Yes. Well, we live in an imperfect world, and standards are not always as standard as we would hope them to be. :)

These standards are based on standards that were developed fifty years ago when Teletype machines roamed the earth. Teletypes normally used a pair of control characters to end a line: CR (Carriage Return) and LF (Line Feed). As the names suggest, the first returned the print head to the first character position in the line, and the second advanced the paper by one line.

Some folks thought that it was inefficient to send two control characters for each line when a machine could be designed to both move the print head and advance the paper in response to a single control character. So they lobbied for the inclusion of a NL (New Line) control character in the ASCII standard to allow for that behavior.

But ASCII was only a seven-bit encoding, which limited it to 128 possible elements, all of which were already in use. So it was decided to use the value that represented LF to also represent NL.

Wait. How can one value represent two different control characters? It can't. Users of ASCII had to agree amongst themselves which control character to use. And so the standard became less of a standard.
ASCII 1968 wrote:LF (Line Feed): A format effector which controls the movement of the printing position to the next printing line. (Applicable also to display devices.) Where appropriate, this character may have the meaning "New Line" (NL), a format effector which controls the movement of the printing point to the first printing position on the next printing line. Use of this convention requires agreement between sender and recipient of data.
So two data terminal operators could agree to use CR/LF or they could agree to use NL, assuming that they had machines that allowed them the option to choose. More likely, a corporation with a data network would buy equipment that all supported one convention or the other. Either way, everyone was happy.

But ASCII was originally developed for communication. When used for encoding files, the concept of "this convention requires agreement between sender and recipient of data" is hard to enforce. When was the last time you sat down with the people who created the text files on your PC and discussed what convention you should use? :)

Since it is not practical to have millions of text file creators discussing which convention to use with millions of text file users, the decision ended up being made by the folks developing operating systems, since system software was usually initially designed to support just one convention.

Of course, the folks developing one operating system might not choose the same convention as those developing another operating system. But it didn't matter at the time, since back then it was rare for a file written on one O.S. to be used on another O.S., unlike today.

Anyway, the Unix world settled on the NL convention, and the MS-DOS/Windows world settled on the CR/LF convention.

Using the value for NL with hardware or software that expects that value to represent LF would, years ago, cause the same behavior that would happen when a LF control character was sent to a Teletype machine: the paper (or the cursor on a CRT display) would advance to the next line, but the print head (or cursor) would not return to the first position in the line.

So this gave a sort of "stair-stepping" output, where the first line would start on the left, but each subsequent line would start under the end of the previous line. Using the MS-DOS commands TYPE and MORE on my old Windows 98 machine showed this kind of output for a file that used NL characters.
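(If you are ever unsure which convention a file uses, you can check it from the Linux command line; test.txt is just an example name:)

Code:

file test.txt           # DOS/Windows files are reported as "ASCII text, with CRLF line terminators"
od -c test.txt | head   # Unix lines end with \n; DOS/Windows lines end with \r \n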


How to convert, if necessary:

Although Windows WordPad and probably many other editors should handle text files from geany that use the default NL convention, you can manually convert the files to use the CR/LF convention with this command:

Code:

unix2dos test.txt
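(If unix2dos happens not to be installed in your Puppy, GNU sed can do the same conversion -- this is only an alternative, not something geany itself requires, and it should only be run on files that still have Unix line endings:)

Code:

sed -i 's/$/\r/' test.txt   # append a carriage return to every line, giving CR/LF endings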
You can also save the files directly from geany with the CR/LF convention by using geany's Document menu:

Document -> Set Line Endings -> Convert and Set to CR/LF (Win)

But, again, there is probably no need to do this. Chances are good that a modern text editor in Windows will open the files fine without you needing to do anything special.

npierce
Posts: 858
Joined: Tue 29 Dec 2009, 01:40

#22 Post by npierce »

nooby wrote:What if one uses Leafpad instead and still uses a FAT-formatted USB drive
to do the transfer?
It will be the same as when using geany. No matter what editor is used, if the filename has no extension (such as ".txt"), and any execute permission bits are set, clicking on it in ROX-Filer won't open the file.
nooby wrote:Is it the FAT format or the lack of .txt that messes things up, or
would that get solved by using the Leafpad editor instead?
Any Linux distro has many text files with no ".txt" filename extension -- and often no extension at all, so that is not the cause of ROX-Filer's confusion. Adding the ".txt" just makes it clear to ROX-Filer that the file is not an executable script, since when ROX-Filer sees a text file with no extension and any of the execute permission bits set, it assumes it must be an executable script.

So it is the FAT filesystem's lack of support for the execute permission bits that confused ROX-Filer (as explained earlier by rcrsn51).

Because FAT doesn't support execute permission bits, when translating a directory entry from FAT, these bits are set to a fixed value that is determined when the FAT filesystem is mounted. By default all have read and execute permission bits set, and the write permission bit is set for the owner (umask 0022). This can be changed with the umask, dmask, and fmask mount options. For instance, using option "fmask=0133" will turn off the execute permission bits for files. (You wouldn't want to turn them off for directories, unless you wanted to keep all users other than root from searching directories.)
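(As a minimal example, assuming the USB partition is /dev/sdb1 and the mount point is /mnt/sdb1 -- both placeholders:)

Code:

mkdir -p /mnt/sdb1
mount -t vfat -o fmask=0133,dmask=0022 /dev/sdb1 /mnt/sdb1
# files now show up as -rw-r--r-- and directories as drwxr-xr-x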

(By the way, the man page for mount says the "default is the umask of the current process" for umask, dmask, and fmask. After some experimentation, I gave up trying to get it to work as described. While I've not made an in-depth look at the code, a quick look seems to indicate that mount began setting umask to a specific value by default way back in October of 2002. (First it used 033, then switched to 022 in December of 2004.) So it hasn't used "the umask of the current process" in over ten years.)

So if you mount a FAT filesystem using option "fmask=0133" the execute permission bits will be off, and clicking on text files with no filename extension in ROX-Filer will open the file in a text editor, unless it is otherwise recognized as a script (for instance, if it had "#!/bin/sh" as its first line). But the downside of that is that you would now have the reverse problem. If you were trying to copy executable files from the FAT filesystem, you would now have to turn the execute permission bits on after copying. (Also, in that case, executable files could not be executed directly from the FAT filesystem.)

I was curious if using extended attributes would make any difference, so I mounted an ext2 filesystem with the user_xattr option and experimented with setting the extended attribute file type to text/plain. This made no difference. Since the only text files that can be executed are scripts, ROX-Filer reasonably assumes that a text file which has execute permission and no filename extension is a script, and attempts to execute it when clicked.
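(For the curious, the experiment looked roughly like this. Treat the details as assumptions -- the device, mount point, and file paths are placeholders, and user.mime_type is the attribute name ROX-Filer is generally understood to read:)

Code:

mount -o remount,user_xattr /mnt/sda3                              # remount the ext2 partition with extended attributes enabled
setfattr -n user.mime_type -v text/plain /mnt/sda3/Index/somefile  # mark the file as plain text
getfattr -d /mnt/sda3/Index/somefile                               # confirm the attribute was stored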


Our Puppies do so much for us that we sometimes ask them to do things that they were not designed to do.

It is wonderful that Linux is able to access FAT filesystems, to allow files created with another O.S. to be shared with a computer running Linux. As a way to share files that might otherwise be unsharable, it is a positive, beneficial thing. But for sharing files between computers running Linux, it is generally preferable to use a native Linux filesystem.

Sure, the actual content of the files will transfer just fine with FAT. And if that is all one is interested in, then using a FAT filesystem is fine. But if you need Unix/Linux-specific things like the permission bits, or timestamps with a resolution finer than two seconds (a lack of which can make certain software believe that the copy of the file that you just made on your FAT filesystem is older than your original file), then a native Linux filesystem is a better choice.
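(You can see the coarser timestamps for yourself by copying a file to a mounted FAT drive while preserving its modification time; the paths here are only placeholders:)

Code:

stat -c '%y  %n' /root/Index/names.txt    # original mtime, full resolution
cp -p /root/Index/names.txt /mnt/sdb1/    # -p asks cp to preserve the timestamp
stat -c '%y  %n' /mnt/sdb1/names.txt      # on FAT the mtime is rounded to a 2-second boundary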

Depending upon how one uses one's Puppy, one might rarely encounter problems using a FAT filesystem. But when a problem does arise, it can be a puzzler, leading to considerable head-scratching and time wasted trying to devise a work-around.

Now, having said all of that, what kind of filesystem do I use on my USB flash drives? Um, . . . ah . . . well I use FAT. :) I took the lazy approach of just sticking with the filesystem that they came formatted with.

So, in this case I didn't take my own advice. On the other hand, I generally just use those drives for casual backup, and use the network for transferring files from one Linux box to another, not flash drives. And a FAT flash drive does come in handy for the occasional need to share a file with a Windows user.

nubc
Posts: 2062
Joined: Tue 23 Jan 2007, 18:41
Location: USA

#23 Post by nubc »

I am transferring an index directory containing about 110 geany files. These files each contain a long list of names, separated by commas, and I constantly update them by adding names. On a weekly basis I transfer the entire Index directory to another computer. Before I make the transfer, on the destination computer, I delete the oldIndex directory, rename the existing Index directory to oldIndex, and then transfer the updated Index directory from the source computer.

Last week, I discovered that the transfer of the updated Index directory from the source to the destination computer does not include all the data. Specifically, several names that should have been there were missing from a certain geany file. This was an embarrassment, so I was determined to correct the omission later when I got an opportunity to inspect the source computer. However, those missing names are present on the source computer, so I conclude that the data transfer between the source and destination computers is incomplete or inaccurate. It may have something to do with the destination computer's desktop icon for "Index", which undergoes a transition when the Index is transferred, from removed item to restored folder. Can anyone explain why this is happening?
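(For reference, written out as commands, the weekly rotation described above would look something like this -- the paths /root/Index, /root/oldIndex and the USB mount point /mnt/sdb1 are only examples:)

Code:

# on the destination computer
rm -rf /root/oldIndex             # discard the two-week-old copy
mv /root/Index /root/oldIndex     # keep last week's copy as oldIndex
cp -a /mnt/sdb1/Index /root/      # bring in the updated Index from the USB drive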

musher0
Posts: 14629
Joined: Mon 05 Jan 2009, 00:54
Location: Gatineau (Qc), Canada

#24 Post by musher0 »

Hello, nubc.

Out of curiosity, what utility/program are you using for the transfer?
Linux's cp, a file manager, a file sync program?
Maybe you should double-check the parameters or settings you are
using for this transfer. A simple typo can create the most unreal error! :?

But it sounds as though the copy or transfer was interrupted somehow,
since the error occurred only this particular time. (Or has it happened
before without you noticing?)

Best regards.

musher0
musher0
~~~~~~~~~~
"You want it darker? We kill the flame." (L. Cohen)

nubc
Posts: 2062
Joined: Tue 23 Jan 2007, 18:41
Location: USA

#25 Post by nubc »

ROX-Filer, to make the transfer between two Wary 5.1.1 computers. I am guessing that the incomplete data transfer has been going on for a while, and I just noticed it. The transfer of the Index folder has always been problematic, as you can read in this thread.

If the solution to this issue isn't obvious, I will probably stop the practice of saving the old Index directory, and instead, just delete it and then transfer the updated directory.

rcrsn51
Posts: 13096
Joined: Tue 05 Sep 2006, 13:50
Location: Stratford, Ontario

#26 Post by rcrsn51 »

@nubc: You have failed to mention the most important piece of information - that you are transferring the files via a USB drive.

So the first thing to check would be the state of the files on that drive.

Are you unmounting the USB drive properly before removing it from the source machine?
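(When doing it by hand, the safe sequence is roughly this -- /mnt/sdb1 is just an example mount point:)

Code:

sync               # flush any cached writes out to the drive
umount /mnt/sdb1   # unmount; if this reports "device is busy", some file on the drive is still open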

nubc
Posts: 2062
Joined: Tue 23 Jan 2007, 18:41
Location: USA

#27 Post by nubc »

Point taken: there is an intermediary stage of the file transfer, which is the temporary storage of the data on a FAT32 external USB hard drive. My practice is to "move" the Index directory when making the final transfer from the USB drive to the destination drive, so there is no evidence of the transfer left on the USB drive.
Yes, I exercise every precaution when using this USB drive, mounting and unmounting, so the problem is not my carelessness. This USB drive is not 100% reliable, as there are times when it simply crashes and the transfer has to be restarted. The weekly transfer involves two directories. The text-based Index directory takes about 3 seconds to "copy" from the source to the USB drive. The other directory can take up to 10 minutes to transfer, and that transfer is typically the one that fails when the drive crashes, which is annoying but not that frequent.

One observation I can make is how clean the omissions seem to be in the transferred Index directory. There are no partial names or any obvious truncations; certain names are simply missing. It makes me think this is an old Index, long since deleted or renamed. On the other hand, the names that are missing are so essential to the list that they must have been added many versions before the ones that currently reside on the destination hard drive.
Last edited by nubc on Wed 11 Sep 2013, 11:30, edited 1 time in total.

amigo
Posts: 2629
Joined: Mon 02 Apr 2007, 06:52

#28 Post by amigo »

The easiest way to avoid the various problems is to archive the folder before saving it to the FAT filesystem. This will avoid the permission and ownership problems, as well as any conversions between DOS format and UNIX format (line endings).
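(A minimal sketch of that approach, assuming the folder is /root/Index and the FAT drive is mounted at /mnt/sdb1 -- both only examples. The tar archive preserves the Linux permissions and ownership internally, and on the FAT drive it is just a single ordinary file:)

Code:

# on the source computer
tar -czf /mnt/sdb1/Index.tar.gz -C /root Index

# on the destination computer
tar -xzf /mnt/sdb1/Index.tar.gz -C /root     # recreates /root/Index with its permissions intact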
