Difference between revisions of "Package talk:MediaWiki"

The nginx information for emerging nginx & setting up PHP should live in the nginx wiki page, referenced from a LEMP page; likewise we should have a LAMP stack meta page directing to apache, PHP, etc. Emerging nginx with the cgi/fpm flags does not check that PHP itself was built with cgi/fpm support, which is needed for nginx to serve PHP. The nginx/apache caveats should be minimized code blocks. Basically I need to insert apache mod_rewrite caveats and have a mess to deal with here. The package isn't pulled in by emerge either, so I have a lot of work ahead of me to clean this page up. [[User:Threesixes|Threesixes]] ([[User talk:Threesixes|talk]]) 21:20, 27 August 2014 (UTC)
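
As a sanity check before pointing nginx at php-fpm, something like this confirms the installed PHP was actually built with the fpm flag (a sketch; equery comes from app-portage/gentoolkit, which may not be installed):

<console>###i## equery uses dev-lang/php | grep fpm</console>

If the flag shows up disabled, one way to fix it is to enable the flag and re-emerge (this assumes /etc/portage/package.use is a file; if it is a directory on your system, put the line in a file inside it instead):

<console>
###i## echo "dev-lang/php fpm" >> /etc/portage/package.use
###i## emerge -1 dev-lang/php
</console>
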
Much of the cleanup has occurred.
 
Connection problems tonight.
 
Generate sitemaps: whether the wiki lives at localhost/mediawiki or localhost/wiki, the sitemap index ends up at localhost/sitemap-index-my_wiki.xml.
 
<console>###i## php maintenance/generateSitemap.php --fspath ..</console>
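
If the URLs in the generated sitemap come out wrong, generateSitemap.php also accepts --server and --urlpath to set them explicitly; the localhost values below are placeholders for this kind of install:

<console>###i## php maintenance/generateSitemap.php --fspath .. --server http://localhost --urlpath /</console>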
 
Obviously this needs more work.
 
robots.txt belongs on the SEO page, but it's worth documenting in this talk page edit in the meantime.
 
robots.txt syntax: http://www.robotstxt.org/robotstxt.html
 
Database of spiders to use with robots.txt: http://www.robotstxt.org/db.html
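
For the record, a minimal robots.txt following the robotstxt.org syntax could look like the below; the docroot path and sitemap URL are assumptions for a default localhost install:

<console>
###i## cat > /var/www/localhost/htdocs/robots.txt << "EOF"
User-agent: *
Disallow: /index.php
Sitemap: http://localhost/sitemap-index-my_wiki.xml
EOF
</console>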
