Steve-Law
Now I know this is trivial, and only really a problem for those of us with some kind of automated update system, but I just noticed a small problem with the Market Update timestamp. Thought it best to mention it. First of all, the time seems to be about 5 minutes fast. But also, and probably this is more important, I just watched the time change over several minutes while the update was actually in progress. There were about 5 starbases in the list at 9:45 - last update said 9:50 (yes, same date!). At 9:47 quite a few more were in the list (page refresh) while the last update said 9:52. I just checked again now at 9:55 and the last update says 9:53 (seems to have finished now). It would be best for us if the last update timestamp was only updated when all updates have been completed. (Unless something else odd is going on.)
Mica Goldstone
I am fairly certain that the timestamp is taken from the ISP at the time of the last adjustment to the data. This means that it will be continually changing for the period of the upload. It is not actually date-stamped by us.
ptb
I probably should have checked this, I just assumed it updated only once. Hmm, that could cause some interesting errors in the market ripper tool. Or at the very least mean you do more than one 'rip' per day. Which I'm sure the KJC website host will love. Steve: a simple solution would be to make the ripper do a check of the time, then wait in a loop until the time stops changing. Would want to wait about 5-10 minutes each time I think, just to be sure. Alternatively make it ignore the rip the first time the page is changed and wait for the next cron job (one hour I believe you have set), by which time it should be sorted.
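The wait-until-the-time-stops-changing loop suggested here is simple to sketch. The thread's rippers are PHP, but the idea is language-agnostic; this is a minimal Python version in which `read_value` (a callable that fetches the page's "last update" stamp) and the sleep function are my own illustrative parameters, injected so the loop can be tested without a live site:

```python
import time

def wait_for_stable(read_value, interval=300, sleep=time.sleep):
    """Poll read_value() until it returns the same value twice in a row.

    read_value fetches the current "last update" timestamp (or any other
    change indicator).  interval is the pause between polls in seconds
    (300s = the 5-minute wait suggested above); sleep is injectable so
    the loop can be exercised in tests without real delays.
    """
    previous = read_value()
    while True:
        sleep(interval)
        current = read_value()
        if current == previous:
            return current  # timestamp stopped changing: update finished
        previous = current
```

In a real ripper, `read_value` would do the HTTP fetch and scrape the timestamp; the loop itself stays the same.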
ptb
Steve, you'll notice that this is exactly the problem we had with the last Den market ripper, due to clearing of the current data and a progressive update of the data. The current system doesn't suffer from that of course, due to the use of the temporary tables - well, except for about 0.5 seconds as the data is moved from one table to the active table. Just a shame MySQL doesn't support views, then there would be no changeover time at all.
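For what it's worth, even without views MySQL can close that 0.5-second window: `RENAME TABLE` renames several tables in one atomic operation, so readers see either the old complete table or the new one, never an empty in-between state. A sketch, assuming table names `markets` (active) and `markets_staging` (freshly loaded) - the names are illustrative, not from the thread:

```sql
-- Load the new data into the staging table first, then swap both
-- tables in a single atomic statement.
RENAME TABLE markets TO markets_old,
             markets_staging TO markets;

-- markets_old can now be truncated and reused as the next staging table.
```

This keeps the "temporary table" approach described above but removes the changeover gap entirely.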
nortonweb
Hi. Is anyone else running automated update systems having an issue with the markets site? When attempting to connect to the page (which I can see via a normal HTTP browser session) I get back a "The requested URL /1024/dataStore/markets.php was not found on this server" 404 error... Has the server changed in the last 2-3 days? Is it checking that a session call is not coming from a ripper? Has my code just gone crap without me touching it (normally I need to touch it for it to turn to crap)??? The KJC site has been very up and down... Any ideas??? Peter
Mica Goldstone
Our ISP is doing a bloody great overhaul and taking their sweet time:

CGI / Frontpage Data Migration - 25th Nov to 3rd Dec
Service Affected: CGI / Frontpage
Maintenance Window: 24 Hours a Day - 25th Nov to 3rd Dec
Detailed description of work to be performed: We have had an ongoing project in place to migrate all of our CGI and Frontpage data across to a new storage platform to improve the performance and fix a number of outstanding issues. We were originally going to run this process out of hours to avoid any customer impact. In light of the recent spate of CGI and FP problems, most of which were caused by the current storage, we have decided to bring this process forward to completion as soon as possible. As such, we will be running the process 24 hours a day until completion, starting immediately. This process will still take a long time and we cannot say for certain exactly when the process will finish. We will make a further announcement when the process has been finished.
Expected Customer Impact: During the data migration, customers will notice a short period of time when they are unable to access their site by any means. This is the period when the data is being copied across and will only happen once per customer.
Other Notes: If any customers experience problems with the site they should contact support, who will investigate the issues.
Regards, Josh Berry, Customer Support
ptb
Works fine for my ripper (and so I assume for the Den ripper too); maybe something changed in the page that you're looking for, in a way that didn't affect how I collect the data? At some point I have to get them to give the market data out in a nice manner. Edit: ignore me, obviously I just ran it before the update started.
Steve-Law
http://www.dewiek.net/ripperlog.php Had some problems, mainly when it tried to update and couldn't see the page, but I have the ripper set to try several times a day and so it seems to catch it at some point. It didn't yesterday though, but it's not a major issue - this is not a permanent problem after all. (Note the problem with the temp tables is an occasional recurring one and I need to figure out what's going on, but as it tries every few hours it usually gets it in the end, so it's a back-burner job for now.)
Steve-Law
Are you viewing the page from your local cache (hence you can see it but your server can't)? Or maybe you were just lucky that it was briefly working at the time you viewed it. It does seem to pop up and down while it's being "migrated" (ouch!).
nortonweb
Sadly it's the up and down :-( I have cache set to off (being a development machine it makes things easier), so it must have just gone in at the right time, as straight after I sent off the post it started to connect OK and stopped sending me alert emails (a great idea until the system hits a problem, then you get a whole heap of emails). Well, it looks like I'll have to ride this through; it's just that as I'm in Oz my up time is their down time :-) Ho hum, modern technology, Bill!!!! Thanks everyone... Pete
ptb
Considering the number of people that appear to run data-ripper-like tools on the KJC site, you've got to wonder if they wouldn't save a lot of bandwidth/CPU time by having all the relevant data, updated after the full update is run, in a single gzipped file. Serving a single compressed static file is a lot easier, plus you can use the HTTP Last-Modified header to find out if it's a new one (which means no downloading a whole page just to get the time). Although mostly I suggest this as it'd make my life easier.
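The Last-Modified idea works like this: keep the header value from your last download, then compare it against what the server reports before fetching again. The header name and its RFC 1123 date format are standard HTTP; the helper name below is mine. A minimal Python sketch:

```python
from email.utils import parsedate_to_datetime

def is_new_snapshot(last_modified_header, cached_stamp):
    """Return True if the server's Last-Modified header is newer than
    the stamp saved from the copy we already have.

    Both arguments are RFC 1123 date strings as used in HTTP headers,
    e.g. "Wed, 01 Dec 2004 09:53:00 GMT".
    """
    server = parsedate_to_datetime(last_modified_header)
    cached = parsedate_to_datetime(cached_stamp)
    return server > cached
```

In practice a ripper would send `If-Modified-Since` with the cached stamp and let the server answer 304 Not Modified, which avoids even the comparison; the function above is the manual equivalent.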
Steve-Law
After continual intermittent problems with the ripper since the whole CGI thing (and, to be honest, before), I would request again that we get something sorted out for this please. Yes, just have Phoenix dump everything into one text file, comma-delimited or whatever, in a known format when it updates your database. Something. Please? Something quick and easy so as not to take your time away from the real game improvements. But I would submit that the kinds of developments we are working on only serve to enhance the game as a whole. (Not sure what the problems are with the ripper, or if it's just me on my server. Sometimes it doesn't read all the data, but you can run it straight afterwards and it gets it, or it gets errors 10 times in a row while I can still browse the market pages fine. It seems to usually update first time late at night, so maybe it's a bandwidth thing? I don't know; I just know the current way of doing it is really troublesome and far from ideal, especially seeing as several people want to access this same data in similar ways (more than just the three of us here, I'm pretty sure).)
nortonweb
You've gotta agree with Steve here. Even if it's just an RSS-type feed that returns a whopping great big XML file, it would be better for everyone (even the non-rippers) to not have all these connections to the server running, especially from the scripts that are polling as well as ripping. Cheers, Pete
Frabby
In BSE, we had a zipfile generated that had all public market data available in .txt format. I would like that back. I work offline most of the time. When checking markets, I load each and every reachable starbase mr into a Netscape register card and switch between them. A text file would be just as good, and also easy for advanced scripts to make use of the data.
Mica Goldstone
I will have a word with David when he gets back. It is probably not that much trouble to write it all to a text file and upload it.
ptb
You could even gzip it to save bandwidth.
Steve-Law
Any news on this? I know David is busy and it will come after whatever he's working on now, but just a little confirmation that it's definitely coming would be nice, so I can put off that part of it while I work on other stuff.
David Bethel
I'm afraid I got bogged down in something 'clever' which is causing lots of work on the compiler bits. It's now on the list for the bug fixes after the order editor.
Mica Goldstone
There is now a link on the website (under downloads) that downloads the entire market as a text file: Latest Market
Steve-Law
Ooo! How exciting!
Steve-Law
Any chance you can add the Market message to this as well please? Also, I've just realised that there's no indication of item type in this file, but I suppose we might be able to just check the items webpage for that? (If it works.)
ptb
Cheers. Okay, it seems to be an ASCII attachment, and it's generated as data is uploaded, so we still need to handle the date/time stuff, Steve. Also I'd recommend using the HTTP 'HEAD' request to check the content size, and only doing the GET when it stops changing; that's what I'm going to do anyway.
Steve-Law
Won't we need a direct address to the txt file itself?
ptb
It would have made life easier; however, this site http://web-sniffer.net/?url=where the markets are shows the head, and "Content-Length: 503635" is part of the HEAD response for the php. This value was a lot lower while the update was running (it appears to have already run for today, btw) and so seems to accurately reflect the file size. Of course, as we still need to check the date stamp for when it starts to change, we could just use when that stops changing too.
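Pulling the Content-Length out of a HEAD response is straightforward; combined with the wait-until-it-stops-changing idea already discussed, it lets a ripper skip the GET entirely while the upload is in progress. A small Python sketch (the function name is mine; in a real ripper the raw header block would come from issuing a HEAD request, e.g. with `http.client`):

```python
def content_length(raw_headers):
    """Extract the Content-Length value from a raw HTTP response header
    block, as shown by tools like web-sniffer.net.

    Returns the size as an int, or None if the header is absent
    (in which case size-based polling can't be used).
    """
    for line in raw_headers.splitlines():
        name, _, value = line.partition(":")
        if name.strip().lower() == "content-length":
            return int(value.strip())
    return None
```

Polling this value until two consecutive HEAD requests agree gives the same "update finished" signal as watching the timestamp, but without downloading the page body each time.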
ptb
This is my way of doing the rip with the new market data file. ripper.phps - the actual source file (see the weird syntax highlighting Apache does by default on PHP source). ripper.mysql.sql - the MySQL database setup I'm using. markets.xml - an XML file of the market data I read in, updated when I detect the KJC update is done, and using the bait-and-switch file write method so there are no partial files.
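The "bait and switch" write mentioned here - write the whole file somewhere else, then rename it over the target - is the standard way to avoid readers ever seeing a partial file, since a rename within one filesystem is atomic. A Python sketch of the technique (ptb's actual tool is PHP; the function name is mine):

```python
import os
import tempfile

def atomic_write(path, data):
    """Write data (bytes) to path without ever exposing a partial file.

    The bytes go to a temporary file in the same directory first; the
    final os.replace() renames it over the target in one step, so any
    concurrent reader sees either the old complete file or the new
    complete file, never a half-written one.
    """
    directory = os.path.dirname(os.path.abspath(path))
    fd, tmp_path = tempfile.mkstemp(dir=directory)
    try:
        with os.fdopen(fd, "wb") as handle:
            handle.write(data)
        os.replace(tmp_path, path)  # atomic on POSIX and Windows
    except BaseException:
        os.remove(tmp_path)  # clean up the temp file on failure
        raise
```

Keeping the temp file in the same directory matters: a rename across filesystems is not atomic, so writing to /tmp and renaming would defeat the point.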
Frabby
Only just found this. For people like me who tend to work offline this is a great help! Cheers Mica/David!
Steve-Law
Hmm. Would a tool that displays the markets from this file be useful for people then?
Frabby
Although I cannot speak for everyone here, I certainly do think it would be useful. A few ideas on what a reader program should do:
- Buttons for each affiliation to include/exclude
- Buttons for each system to include/exclude
- Sort by system (as on the KJC site)
- Sort by item (as on the KJC site)
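The core of such a reader is a single filter-and-sort pass over the parsed records; the buttons just feed it include-lists. A Python sketch - the field names (`"affiliation"`, `"system"`, `"item"`) are illustrative, since the real column names would come from the downloaded text file's format:

```python
def filter_markets(records, affiliations=None, systems=None, sort_key="system"):
    """Filter and sort parsed market records for a reader UI.

    records is a list of dicts, one per market entry.  affiliations and
    systems are optional include-sets (None means "include all"),
    matching the include/exclude buttons; sort_key ("system" or "item")
    selects between the two sort views on the KJC site.
    """
    rows = [
        r for r in records
        if (affiliations is None or r["affiliation"] in affiliations)
        and (systems is None or r["system"] in systems)
    ]
    return sorted(rows, key=lambda r: r[sort_key])
```

Exclude buttons fall out of the same function: removing an affiliation from the include-set is the exclusion.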
ptb
Steve, couldn't you modify your market pages so you had a version that only went online when told to, got the new data file, then used that in offline mode? You could just distribute the webpages then (as all your sort code is JavaScript-side already).
Steve-Law
Except that the pages are generated by PHP from the MySQL database. It would be a complete rewrite with only the front end remaining more or less the same.