Discussion:
speed-up icacls
Ammammata
2021-08-12 07:09:13 UTC
Permalink
I run the command below

icacls "D:\test" /grant John:(OI)(CI)F /T

to make sure John has full access to all files and folders in D:\test.

This is because he gets random "access denied" errors on files saved by
other users.

Currently the aforementioned folder contains quite a lot of files:

Total Files Listed:
163647 File(s) 176,384,640,967 bytes
99338 Dir(s) 285,917,642,752 bytes free

It takes ages to complete, slowing down access for all the other users.

Currently it runs at night, so who cares? But sometimes, mainly after huge
updates from other users, I need to run it during working hours.

How can I speed it up?

Is there any valid alternative to it?

TIA
--
/-\ /\/\ /\/\ /-\ /\/\ /\/\ /-\ T /-\
-=- -=- -=- -=- -=- -=- -=- -=- - -=-
........... [ at work ] ...........
Kerr-Mudd, John
2021-08-12 10:03:17 UTC
Permalink
On Thu, 12 Aug 2021 07:09:13 -0000 (UTC)
Post by Ammammata
I run the below command
icacls "D:\test" /grant John:(OI)(CI)F /T
to make sure John has full access to all files and folders in D:\test
This is because he gets random "access denied" errors on files saved by
other users
163647 File(s) 176,384,640,967 bytes
99338 Dir(s) 285,917,642,752 bytes free
It takes ages to complete, slowing down access for all the other users
Currently it runs at night, so who cares? But sometimes, mainly after huge
updates from other users, I need to run it during working hours.
How can I speed it up?
a) Reduce the number of files
b) ensure John is granted rights by the programs that create the files.
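A cheaper variant of (b), sketched as a dry-run script (the path D:\test and
the user John come from the original post; everything else is an assumption,
and RUN=echo only prints the commands instead of executing them): grant the
inheritable ACE once on the root folder, which is instant, so every file
created afterwards inherits it via NTFS inheritance. The slow /T sweep is
then only needed as a one-time catch-up for pre-existing files.

```shell
#!/bin/sh
# Sketch only: D:\test and user John come from the post; RUN=echo makes this
# a dry run (icacls is Windows-only). Set RUN= on the real server to execute.
RUN="${RUN:-echo}"
ROOT='D:\test'

# Fast: a single inheritable ACE on the root folder. (OI)(CI)F means
# object-inherit + container-inherit, Full control, so files and folders
# created later pick it up automatically with no recursive rewrite.
$RUN icacls "$ROOT" /grant 'John:(OI)(CI)F'

# Slow one-time catch-up for files that already exist. /C continues past
# per-file errors; /Q suppresses success messages, cutting console I/O.
$RUN icacls "$ROOT" /grant 'John:(OI)(CI)F' /T /C /Q
```

After the one-time /T pass, the nightly job should no longer be necessary for
newly created files, only for files whose ACLs were explicitly changed.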
Post by Ammammata
is there any valid alternative to it?
TIA
A deeper question:
Why does John need to look at/change all these other users' files?
--
Bah, and indeed Humbug.
Ammammata
2021-08-12 10:28:54 UTC
Permalink
On Thu 12 Aug 2021 12:03:17 PM, *Kerr-Mudd, John* wrote on
Post by Kerr-Mudd, John
a) Reduce the number of files
No way: right now it's the history of all issued orders, and it must be
kept online for at least 5 years (or more).
Post by Kerr-Mudd, John
b) ensure John is granted rights by the programs that create the files.
Most of those docs (pdf, jpg, etc.) come from foreign manufacturers, so I
can't help with that: they arrive by mail, are saved on the many users'
computers, first locally, then moved to the "archive".
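That "moved to the archive" step may be the actual cause of the errors: on
NTFS, moving a file within the same volume keeps the file's original ACL
instead of inheriting the destination folder's permissions. A copy-then-delete
move makes the file inherit the archive's ACL; robocopy /MOV does exactly
that, and by default it does not copy security information. A dry-run sketch
(the source and destination paths are made up for illustration):

```shell
#!/bin/sh
# Hypothetical paths; RUN=echo keeps this a dry run on non-Windows systems.
# A same-volume move keeps the source ACL; copy+delete (robocopy /MOV)
# lets the archived file inherit the destination folder's permissions.
RUN="${RUN:-echo}"
SRC='C:\Users\alice\outbox'
DST='D:\test\archive'

# /MOV copies files then deletes them from the source; /E recurses into
# subfolders, including empty ones. ACLs are NOT copied by default, so the
# files pick up the inherited permissions of the archive folder.
$RUN robocopy "$SRC" "$DST" /MOV /E
```

If users archive this way, John's inherited rights on the archive root would
apply to the files immediately, with no recursive icacls run at all.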
--
/-\ /\/\ /\/\ /-\ /\/\ /\/\ /-\ T /-\
-=- -=- -=- -=- -=- -=- -=- -=- - -=-
........... [ at work ] ...........
JJ
2021-08-13 15:04:27 UTC
Permalink
Post by Ammammata
On Thu 12 Aug 2021 12:03:17 PM, *Kerr-Mudd, John* wrote on
Post by Kerr-Mudd, John
a) Reduce the number of files
No way: right now it's the history of all issued orders, and it must be
kept online for at least 5 years (or more).
Post by Kerr-Mudd, John
b) ensure John is granted rights by the programs that create the files.
Most of those docs (pdf, jpg, etc.) come from foreign manufacturers, so I
can't help with that: they arrive by mail, are saved on the many users'
computers, first locally, then moved to the "archive".
I'd suggest processing only the newly added files. Use a script/tool to
monitor the folder for newly created files and put them in a "new files"
or "files to process" list file, then use that list file to process the
files. A list entry should only be removed from the list once its file has
been processed successfully, without any error. Otherwise, keep it in the
list to be processed later, either at the next occurrence of a new file,
or with another script/tool that processes the list file periodically.

For old files... it has to be done the long way, one last time, to make
sure all the old files have been processed.
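The "process only new files, retry on failure" idea can be sketched in
Python roughly like this. It is a simplification of the per-entry list file:
a recorded timestamp (the file name `last_run.txt` and the `grant_full` stub
are made-up names) that only advances when every new file was handled without
error, so failures are retried on the next run. On the real server the stub
would shell out to icacls, e.g. `icacls <file> /grant John:F /Q`.

```python
import time
from pathlib import Path

STATE = Path("last_run.txt")  # hypothetical name for the last-run marker


def grant_full(path):
    """Stub for the Windows-only step. On the real server this would run:
    subprocess.run(["icacls", str(path), "/grant", "John:F", "/Q"], check=True)
    Left as a print so the surrounding logic can be exercised anywhere."""
    print(f"would grant full access on {path}")


def process_new_files(root, grant=grant_full):
    """Grant rights only on files modified since the last successful run.

    The marker only advances when no file failed, so failed files are
    retried automatically the next time this runs."""
    last = float(STATE.read_text()) if STATE.exists() else 0.0
    started = time.time()  # recorded before scanning, so nothing is missed
    failed = False
    for p in Path(root).rglob("*"):
        if p.is_file() and p.stat().st_mtime > last:
            try:
                grant(p)
            except Exception:
                failed = True  # keep the old marker; this file retries later
    if not failed:
        STATE.write_text(str(started))
```

Scheduled every few minutes, this touches only the handful of files added
since the previous run instead of re-stamping all 163,647 of them.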
