Posts: 58 | Thanked: 42 times | Joined on Jan 2010
#31
Originally Posted by RDJEHV View Post
If by crashing you mean a white screen and being unable to load any page, then yes, I have the same problem. I didn't realise this problem could be associated with evopedia.
I fear this problem is indeed associated with evopedia, since it also occurs on my device. I'll try to start the browser in a different way and see if it persists.

I have found that the browser can be made to work again by closing all of its windows, waiting a few seconds and answering "yes" to the dialog (something like "application is not responding, close it?").
 

Posts: 119 | Thanked: 14 times | Joined on Nov 2009
#32
Ahh, yes. This workaround works for me, thanks. How can I fix the map problem?
 
Posts: 58 | Thanked: 42 times | Joined on Jan 2010
#33
Originally Posted by RDJEHV View Post
Ahh, yes. This workaround works for me, thanks. How can I fix the map problem?
Please see my post on page 3.
 

Posts: 119 | Thanked: 14 times | Joined on Nov 2009
#34
I somehow missed that post. Thanks.

It didn't quite work exactly as you said: I also needed a reboot after removing the evopediarc for the map to work. Works like a charm now, many thanks!

Now I only need an English version, although German works fine for me as well. I grew up close to the German border in the Netherlands, so I understand most of it. Thanks again!
 
Posts: 119 | Thanked: 14 times | Joined on Nov 2009
#35
I have a suggestion for improvement. maep uses part of a tile from a more zoomed-out level when it doesn't have a tile for the current zoom level in its cache; evopedia doesn't seem to be able to do this.

Furthermore, browsing the map could be smoother, but that isn't the core of the program. It would be best to have evopedia integrated into maep, which can show wiki pages on the map as well.
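The zoom-level fallback described above (showing a scaled crop of a coarser cached tile until the exact tile is available) boils down to standard slippy-map tile arithmetic. Here is a minimal sketch in Python; `tile_cached` is a hypothetical cache-lookup predicate, not part of evopedia or maep:

```python
def parent_tile(x, y, z):
    """Return the tile one zoom level up that covers tile (x, y, z),
    plus the quadrant of the parent that (x, y) occupies."""
    # Quadrant inside the parent: (0, 0) = top-left ... (1, 1) = bottom-right.
    return (x // 2, y // 2, z - 1), (x % 2, y % 2)

def fallback_tile(x, y, z, tile_cached, max_up=3):
    """Walk up the tile pyramid until a cached ancestor tile is found.

    `tile_cached(x, y, z)` is a hypothetical predicate for the local
    tile cache. Returns ((x, y, z), steps_up) for the nearest cached
    ancestor, or None if nothing suitable is cached."""
    for steps in range(1, max_up + 1):
        (x, y, z), _ = parent_tile(x, y, z)
        if z < 0:
            break
        if tile_cached(x, y, z):
            return (x, y, z), steps
    return None
```

The caller would then crop and scale the returned ancestor tile to fill the gap while the real tile is fetched in the background.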
 
Posts: 58 | Thanked: 42 times | Joined on Jan 2010
#36
Originally Posted by RDJEHV View Post
I have a suggestion for improvement. maep uses part of a tile from a more zoomed-out level when it doesn't have a tile for the current zoom level in its cache; evopedia doesn't seem to be able to do this.

Furthermore, browsing the map could be smoother, but that isn't the core of the program. It would be best to have evopedia integrated into maep, which can show wiki pages on the map as well.
Thank you, these are good suggestions, but as you already noted, I think these features are too complex for a simple JavaScript map application (at least on the N900). Integrating evopedia into maep, or maep into evopedia, would be the right way to go, I think.
The next big step for evopedia is creating a stand-alone GUI application (in contrast to a web server) that uses an embedded HTML viewer like Qt's WebKit and an embedded map viewer like maep.
Let us get version 0.3.0 stable first; then perhaps that will go into version 0.4.0.
 
Posts: 119 | Thanked: 14 times | Joined on Nov 2009
#37
How are the dumps coming along? I think that with an English wiki dump we can get more users; that will help in getting feedback, and maybe some of them will be able to contribute. Maybe by then it would be wise to create a new topic with a more descriptive title to attract more attention. It worked wonders for the Angry Birds level pack.

Last edited by RDJEHV; 2010-02-09 at 14:00.
 
Posts: 58 | Thanked: 42 times | Joined on Jan 2010
#38
Originally Posted by RDJEHV View Post
How are the dumps coming along? I think that with an English wiki dump we can get more users; that will help in getting feedback, and maybe some of them will be able to contribute. Maybe by then it would be wise to create a new topic with a more descriptive title to attract more attention. It worked wonders for the Angry Birds level pack.
A new English dump (including rendered formulas) should be ready by the end of this week, but I can't promise anything...
 
Posts: 4 | Thanked: 0 times | Joined on Feb 2010
#39
Hi, crei,

I'm working with the Vietnamese dump of Wikipedia. However, after getting the 'commons' part of Wikipedia, the program always gives me a message like this:
2010-02-11 21:31:38 URL:http://download.wikimedia.org/viwiki...ticles.xml.bz2 [122033246/122033246] -> "/home/xxx/data/evopedia_dumps//source_dumps/vi/wiki-latest-pages-articles.xml.bz2" [1]
Read error (Connection reset by peer) in headers.
and then it aborts.

Could you please give me some advice on how to solve this? I'm just a noob in this field. Thanks!

P.S.: I've tried downloading the file manually, commented out the "getSourceDumps $language" line in the script and ran it. After that, I received another error:
Error: 1146 Table 'wikidb.page' doesn't exist (localhost)
As I understand it, the guide in the maemo wiki tells me to create an empty database, so there should be no tables in it yet. So why am I getting this message?

Last edited by sollos; 2010-02-11 at 17:21.
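A "Read error (Connection reset by peer)" on a download this large usually just means the connection dropped partway through. Independent of the dump script itself (which is not reproduced here), a common remedy is to retry the flaky step with a growing delay. A minimal sketch, where the `fn` argument stands in for the actual download call:

```python
import time

def retry(fn, attempts=5, delay=1.0, backoff=2.0, exceptions=(OSError,)):
    """Call `fn` until it succeeds, sleeping between failed attempts.

    `fn` stands in for whatever flaky step is being retried (here, a
    download). The delay grows by `backoff` after each failure, and the
    last failure's exception is re-raised if all attempts are used up."""
    for attempt in range(attempts):
        try:
            return fn()
        except exceptions:
            if attempt == attempts - 1:
                raise
            time.sleep(delay)
            delay *= backoff
```

With wget itself, `wget -c --tries=5 URL` does much the same job and additionally resumes partial downloads, which is effectively what you did by fetching the file manually.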
 
Posts: 75 | Thanked: 78 times | Joined on Jan 2010 @ Germany
#40
Originally Posted by crei View Post
…including rendered formulas…
That sounds interesting! I'm missing the formulas in the German dump!
How do I create a dump with rendered formulas? And what is stored in the database while creating a dump? (I would like to know because I make a daily backup of my database, and with the whole Wikipedia this could become quite large.)
 