Saturday, November 25. 2006
The Thanksgiving holiday and extended weekend have given me some time to spend on projects that have been nagging at me. Chief among them has been figuring out a way to make our dear Serendipity35 run a bit faster.
Back in the summer, I discovered that the bottleneck through which Serendipity35 tried to squeeze its content was our backend database server, MySQL. In July the load on the database server was so heavy that I had to move it to a faster, less busy machine to prevent endless waits for pages and random corruption of our main display pages. While off-loading the database processes stopped the random file corruption we were experiencing, it only sped up the delivery of our pages a little.
For about five years, PostgreSQL has been my database backend of choice when developing a new project. At first I didn't have any compelling reason to use PostgreSQL over MySQL other than its simpler method of assigning database user and host permissions, but as time moved on, I discovered some things about the structure of PostgreSQL that made me more comfortable designing web sites and applications with it.
I'm not advocating using PostgreSQL over MySQL in all database-driven applications, but here, in an egg basket, is why PostgreSQL can outperform MySQL in this blogging environment.
If we were sitting at the breakfast table on our Cyber Farm and ordered an omelet with MySQL as our server du jour, every time the cook wanted an egg, MySQL would go and gather every egg on the farm in its basket and bring it to the cook, even if just one egg was requested. On a fairly large farm, the load of carrying all the eggs around all the time would considerably tire out (and slow down) our server.
If PostgreSQL were our server and we ordered a three-egg omelet, it would have to make three trips from the chicken coop to the cook because it has a much smaller egg basket than MySQL. But with that small basket, and without having to gather every egg on the farm each time even one egg was requested, PostgreSQL would make much faster trips and need fewer supporting resources. Our virtual breakfast could be enjoyed without waiting for the server to catch its breath and bring us our finished omelet.
Daily repasts aside, I decided I wanted to switch our backend from MySQL to PostgreSQL and bask in the glow of its promised speed increase. But the database conversion process was slow, and after many starts, stops, and hand edits, I couldn't get the data stored in the MySQL database to play nice with PostgreSQL. I needed a fresh start and a different approach.
I built a new installation of Serendipity on my server at home and configured it to use PostgreSQL as its backend server. After installing Serendipity35's users and categories by hand, I set up an RSS feed to capture all of the current content and place it on my new "at home" server. You can see the increase in speed and page delivery for yourself. Not everything was a smooth transition, though. Articles had to be edited by hand to preserve the original author ID, and phpPgAdmin saved my virtual bacon when it came to making those edits.
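For anyone making a similar move, those edits boiled down to statements along these lines, whether run through phpPgAdmin's SQL window or from the shell with psql. This is only a sketch: the serendipity_entries table and authorid column assume Serendipity's default schema, and the user, database, and ID values here are placeholders, not the real ones.

    # hypothetical example: hand an imported entry back to its original author
    # (table and column names assume Serendipity's default schema)
    psql -U s9y_user -d serendipity -c \
      "UPDATE serendipity_entries SET authorid = 2 WHERE id = 117;"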
As of now the imported web log looks pretty good when viewed with Firefox, Mozilla, or Konqueror, but it loses its sidebar in Internet Explorer. The improper display in Internet Explorer may be a function of the Serendipity version I installed there, 1.0.3a; Serendipity35 runs on version 1.0.2.
My next project for this rapidly evaporating weekend will be creating an SQL dump from my home server's database that I can import into a PostgreSQL server supporting Serendipity version 1.0.2 at NJIT, and then making the switch to PostgreSQL on this blog to improve its performance.
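For the curious, the dump-and-restore step I have in mind looks roughly like the sketch below; the database names, user name, and file name are placeholders rather than the actual NJIT settings.

    # on the home server: dump the blog's PostgreSQL database to a plain SQL file
    pg_dump -U s9y_user serendipity_home > serendipity.sql
    # on the destination server: create an empty database and load the dump
    createdb -U s9y_user serendipity_njit
    psql -U s9y_user -d serendipity_njit -f serendipity.sql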
One project I'm NOT tackling this weekend is cleaning up my desk.
Wednesday, November 22. 2006
Hits, Sites and Kilobytes
There comes a time when anyone who has ever uploaded anything onto the web stops and wonders, "Is anyone really looking at this?" and "How would I know if anyone did?"
On the simplest level you could open your webserver's logfiles and count all the accesses to your website and/or some specific files you are curious about. Though if you've never looked at a webserver's logfiles before, that method will cure you of any desire to ever do it again. On a less simple (and more geeky) level you could log into your webserver's host using a command shell and type some command like:
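    # a rough sketch, assuming an Apache-style access log; the log path and the
    # page pattern are placeholders for your own server's values
    grep 'GET /my_page.html' /var/log/apache/access_log > how_many_hits_have_I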
Each line in the how_many_hits_have_I file would count as one hit on your website.
Commercial web-tracking tools like WebTrends and Urchin do a very nice job of displaying the traffic to your website, but those commercial tools have commercial price tags, often with license fees starting at around $900. Fortunately, there are some open source alternatives.
Webalizer: An open source webserver logfile graphical analysis tool that features detailed reports of accesses to your server broken down by raw hits, page views, data transfer, referrer, browser, country, and more. It also features a history file that tracks data by months and/or years to show long-term trends. Originally only available on Unix and Linux based servers, Webalizer now has a version that runs under Windows and can analyze Microsoft IIS server files. Webalizer is configured and run from a command line, so some familiarity with using a command line tool is required; a sample invocation appears after these descriptions.
Logaholic: Based on the open source tools PHP and MySQL, Logaholic is not free (currently $47), but its database backend allows for a more detailed analysis (by day and by hour, for example). Essentially a set of scripts, Logaholic requires that the PHP language be installed, along with read and write access to a MySQL database. It doesn't currently support Microsoft Windows servers.
AWStats: Free to use under the GNU Public License, AWStats combines webserver log tracking and analysis with FTP and mail server logfile analysis. It requires that your webserver support Perl and CGI, and it can be run from the command line or as a CGI. It supports almost all platforms, servers, and logfiles, and even claims to be able to analyze the logfiles created by streaming servers.
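To give a feel for the command-line side of Webalizer mentioned above, a typical run looks something like the sketch below; the paths are examples, and your own installation's configuration will differ.

    # read settings such as LogFile and OutputDir from a configuration file
    webalizer -c /etc/webalizer.conf
    # or name an output directory and a logfile directly on the command line
    webalizer -o /var/www/html/stats /var/log/apache/access_log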
There are also online services that offer web tracking, from a simple hit counter that you embed in a web page to more sophisticated solutions (some free) that provide you with analytic statistics. Be aware, though, that some of these hosted solutions may embed advertising in their reports and might sell your web site traffic information to third parties.
Besides being a very good tool for honing your content and targeting your audience, website traffic analysis can tell you statistically who is actually visiting your site, and that can really help on those days when you feel like a lone voice shouting in the wilderness.
Tuesday, November 21. 2006
Leaving Joseph Knecht's future-century gaming prowess and Professor Rowley's 20th-century published kvetching behind, we've splashed our way into the shallow end of the 21st century and discovered our own Glasperlenspiel.
The landscape of the virtual classroom, while still in its infancy, has stretched beyond our control and beyond our tactical ability to apply our own expanding teaching methods in this dynamic environment. The virtual classroom has left the school building, and, given access to this new pedagogic province, each student outside the classroom is equal, no matter what their background or educational goals might be.
The contemporary Trivium and its virtual curriculum have been embraced by the participants of the universal classroom. The grammar of the interconnected language is used to produce a collective logic, and from that logic new critical thinking skills emerge. These new skills help produce the new rhetoric of our time.
We've combined music and geometry, astronomy and arithmetic by virtue of our new technologies. Apart from some intellectual seeding done by our traditional educational institutions, the combination of these fields of interest has been both accidental and intentional on the part of its participants. The independent creation of personal video and music content, along with combined and cooperative astronomical and pure-math endeavors on the internet, has produced a collective instrument on which the entire intellectual content we possess can be played.
Learning to play this instrument has been the longstanding stated goal of traditional educational institutions. But the investment of time and money in the traditional experience has left many who have overcome those hurdles without any incentive to make that experience more open to others who want to play.
Open ended learning has become the collective metaphor of our time. There are, and have been, initiatives taken at traditional schools to foster inter-disciplinary studies, but those initiatives exist within a closed system of learning. While there is great benefit in having guidance along the path to knowledge, those guides must be more than a handful of the select who point out volumes of selected knowledge from the entire stock of human experience.
Our new learning opportunity hinges on the open exposition that is presented to us, our evaluation of that expository content, the interchangeability of that content between an individual and groups of individuals, and our personal and public investment in the product of the learning experience.
Our challenge will be to reconcile ourselves to the end of permanent knowledge that dynamic and continuous learning will bring. And if knowledge, itself, is no longer an end, what goal will learning have? Perhaps we will just train masters of our new glass bead game.
Original content in this work is licensed under a Creative Commons License