v19 Video Database Cleaner add-on
I properly formed all of the comment opens and closes, and it now works perfectly.

Thank you!
Kodi 17.6 on multiple Windows 10 x64 machines with shared MySQL 5.6.43 database. Kodi 18 on Xbox One. Content (music, video, photos) stored on Windows Server 2012 R2 file server and accessed by SMB.
Reply
(2018-10-18, 13:13)AnthonyB Wrote: I properly formed all of the comment opens and closes, and it now works perfectly.

Thank you!
Good to hear :)

The Python library that the cleaner calls to read the XML files (elementtree) expects comments to be correctly formed, and it threw the error you got because one of them wasn't. I don't think there is any (easy) way of avoiding that, other than writing my own XML parser. If you follow best practice and all comments have opening and closing tags, then there's no issue, as you have found.
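To illustrate the behaviour described above, here is a minimal standalone sketch (plain standard-library `xml.etree.ElementTree`, not the addon's actual code) showing how one malformed comment stops the whole parse:

```python
import xml.etree.ElementTree as ET

# a correctly formed comment parses fine
ET.fromstring("<sources><!-- keep this source --><video/></sources>")

# a malformed comment (missing a hyphen) makes elementtree stop dead
try:
    ET.fromstring("<sources><!- keep this source --><video/></sources>")
    parse_failed = False
except ET.ParseError as err:
    parse_failed = True
    print("parse failed:", err)
```

The parser refuses the whole document rather than skipping the bad comment, which matches the behaviour reported in this thread.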

Actually, having just thought about it, it might be a good idea to catch that error and notify the user of the issue, rather than just grinding to a halt, so I'll have a look at doing just that over the weekend.
Learning Linux the hard way !!
Reply
It might also be a good idea to update the Wiki with properly formatted tags as needed by Kodi's parser as users tend to copy whole lines from there.
Reply
I was wondering: every time I run this add-on (which I appreciate a great deal), I always get the same number of entries to be removed.

"There are 23 entries to be removed."

I am left to wonder: are these not being removed, or do they need to be removed every time the add-on runs? I lean towards the former.
Reply
Here is a repost of what worked for me.

Ok, so I finally worked out how to get this working for me.

I ran it normally (a rerun showed the same number to remove every time).
I checked the log; in my case there were many that started with /4tb1/, so I used the path section of the configuration and deleted them.
Then I found the next most frequent.
Rinse and repeat.
Now they are all gone and my library is clean.
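For what it's worth, the "find the next most frequent prefix" step could be automated with a few lines of Python (the log lines below are hypothetical; the real cleaner log format may differ):

```python
from collections import Counter

# hypothetical path entries pulled from the cleaner's log;
# the actual log format may differ
log_lines = [
    "/4tb1/movies/a.mkv",
    "/4tb1/movies/b.mkv",
    "/4tb2/tv/c.mkv",
]

# tally the first path component of each entry
prefixes = Counter("/" + line.split("/")[1] + "/" for line in log_lines)
for prefix, count in prefixes.most_common():
    print(prefix, count)
```

The most frequent prefix comes out first, which is the one to put into the path section of the configuration next.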

I don't know why it would not work without specifying the path, but for anyone having this same problem, this may be the solution.

BTW: it has been working normally since I did this; I did not need to do it a second time.

Regards Derek
Reply
Same issue as Derek above: I was getting the same 10 entries saying they were going to be removed, then the script saying they were removed, then the same 10 again on the next run.  I added each of the 10 paths into the "Remove specific path from database" option in the addon configuration, and this resulted in the entries actually being removed.

Is the SQL code for removal different for the "Remove specific path" function?  It seems like it must be.
Reply
Yes, the code for removing a specific path takes no account of what may be in the path; it just removes it from the database.  The 'general' cleaning code, however, attempts to produce a list of paths to remove by building a query that excludes any paths in your sources.xml plus whatever else you have set to keep (PVR, bookmarks etc.) and then deletes from the rest anything that starts with 'rtmp(e)://', 'plugin://' or 'http(s)://'.  Possibly this part needs a revisit, as there could well be other protocols in there that the addon is not checking for.
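A rough sketch of the kind of query being described, run against a toy in-memory stand-in for the database's path table (the real Kodi schema and the addon's actual SQL will differ):

```python
import sqlite3

# toy stand-in for the video database's path table;
# the real schema has many more columns
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE path (idPath INTEGER PRIMARY KEY, strPath TEXT)")
con.executemany("INSERT INTO path (strPath) VALUES (?)", [
    ("smb://nas/movies/",),
    ("plugin://plugin.video.example/",),
    ("http://example.com/stream/",),
])

# paths read from sources.xml that must be kept
sources = ["smb://nas/movies/"]
placeholders = ",".join("?" * len(sources))

# delete anything outside the kept sources that looks like a
# streaming/plugin path (rtmp% also matches rtmpe, http% also https)
con.execute(
    f"DELETE FROM path WHERE strPath NOT IN ({placeholders}) "
    "AND (strPath LIKE 'rtmp%' OR strPath LIKE 'plugin://%' "
    "OR strPath LIKE 'http%')",
    sources,
)
remaining = [row[0] for row in con.execute("SELECT strPath FROM path")]
print(remaining)
```

Only the path listed in sources survives; the plugin:// and http:// entries are deleted, which mirrors the exclusion-then-delete logic described above.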

Certainly, if you are running the addon and getting the same paths listed over and over without them being removed, then something is definitely amiss there.  I shall give the code a look over and see if I can reproduce the same issue here.  It might take me a few days though, as I shall have to wrap my head back around code I wrote three years ago!
Learning Linux the hard way !!
Reply
Thanks black_eagle.
That may help many users.
In my case /4tb2 etc. were old sources. It may be that removed sources stay in the database, and that's what tripped up your program.
Hopefully there is a flag or similar that you can check for a defunct source.

Derek
Reply
Yes, I can confirm that using the log and removing them works. It's a little painful and slow, though; it would be nice to be able to enter more than one path at a time before running it. With 23 entries, it takes some patience.

Still I am grateful for the Add-on. It solved some headaches I was having!
Reply
(2018-11-12, 21:59)glubbish Wrote: Thanks black_eagle.
That may help many users.
In my case /4tb2 etc. were old sources. It may be that removed sources stay in the database, and that's what tripped up your program.
Hopefully there is a flag or similar that you can check for a defunct source.

Derek
Well, that at least gives me a scenario to test with.  I can soon add some dummy stuff to a database and ditch the source to see what happens.

However, I believe Kodi's own internal routines should remove stuff if you remove a source (depending possibly on how you do it).  Certainly, if that particular path is present in sources.xml, the addon won't touch it.  The other thing is that Kodi allows for sources to be present (with associated paths etc. in the db) but not actually currently available, e.g. they may be on a removable drive or a NAS drive that's currently offline.  This makes it difficult to determine whether a source is really defunct or just not available at that time.  Testing for the presence of a path or file is easy enough, but for the reasons already stated, the inability to access that path doesn't necessarily mean it's defunct, so the addon can't simply remove it; otherwise it might be wiping out perfectly valid parts of a user's database.
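The "available vs. defunct" ambiguity can be seen in miniature with a plain filesystem check (a hedged sketch; Kodi itself would go through its own VFS layer rather than `os.path`):

```python
import os

def source_state(path):
    """Classify a source path. 'unavailable' is NOT proof that it is
    defunct -- it may just be an offline NAS or an unplugged drive."""
    if os.path.exists(path):
        return "available"
    # nothing here can distinguish 'defunct' from 'temporarily offline'
    return "unavailable"

print(source_state(os.getcwd()))       # the current directory always exists
print(source_state("/no/such/share"))  # missing, but maybe only for now
```

Both an old, deleted source and a NAS that happens to be powered off return "unavailable", which is exactly why the addon can't safely auto-delete on that basis.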

I will certainly do some testing with some dummy paths and videos to see what's going on, and check the SQL statements the addon is forming to make sure they are 'right'.  If there happens to be an error in the SQL, or something goes wrong when trying to execute it, the addon will make a notification to that effect in the debug log, but not on-screen (perhaps there is an improvement to be made right there!) and the database will not be altered.

So yeah, I have a few ideas about what could cause it and a scenario to test with so hopefully I can come back in a few days with either a more definitive answer or a fix.
Learning Linux the hard way !!
Reply
Do any of those who reported problems use NFS shares? I do, and I've noticed that the addon doesn't enumerate my sources correctly: they are nfs://... but it displays smb://... Of course, cleaning will fail in these cases, since no such path exists in the database.

The result is that all objects in the database are marked for cleaning (in my case almost 2,000) because they don't fall under the exclusion of the defined sources, but none are cleaned (thank god), so the same count of objects is displayed the next time the addon is called.
Reply
Mine were old smb shares no longer valid.
Reply
The addon makes no distinction between how a source is shared (other than in certain cases to look for streaming sources), and what is listed on screen should be exactly what it has pulled out of the database.  It operates entirely on that information, and the only actual 'paths' it knows anything about are for database backup and log-file creation.

I use nfs shares as well as smb shares, but I can't say it has ever listed my nfs shares as smb.  I wonder if there is a possibility that they are old paths that it's listing, but not removing for the reasons I suggested earlier?  A look inside your database @HeresJohnny, in the paths table, will show you if that is indeed the case.  From the queries the addon is able to generate, if it is showing you paths starting with smb:// then those paths must exist in your database, regardless of whether or not they are actually valid or whether you expect them to be.  The paths the addon displays on screen (and writes in the log) are exactly as pulled from the database.
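For anyone wanting to do that check, here is a sketch of the kind of query involved, run against a toy stand-in for the paths table (the actual Kodi schema differs between versions, so treat the table layout as illustrative):

```python
import sqlite3

# toy stand-in for the video database's paths table
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE path (strPath TEXT)")
con.executemany("INSERT INTO path VALUES (?)", [
    ("nfs://nas/tv/",),
    ("smb://oldbox/movies/",),
])

# any smb:// rows the addon reports must actually be present here
smb_paths = [row[0] for row in con.execute(
    "SELECT strPath FROM path WHERE strPath LIKE 'smb://%'")]
print(smb_paths)
```

If a query like this returns smb:// rows, they are real leftover entries in the database (e.g. from old shares), not something the addon invented.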
Learning Linux the hard way !!
Reply
It should be that way, shouldn't it? I re-checked my sources.xml and did a search for smb:// through the whole database, without result. Everything is nfs://... I am as confused as you are...
Reply
I exclusively use smb:// shares.  The stale entries the script was picking up were old shares that no longer exist and are not present in sources.xml.
Reply
