Monday, 30 April 2007
Backup is a subject which comes up a lot on the rack, as regular readers will have noticed, and I have already detailed several of the strategies we employ to make sure we have a myriad of copies of our essential data spread across the network, preferably as far apart as possible! Our latest investigation concerned how to get a usable copy of our most precious, and most bloated, database from our SQL Server 2000 installation across the VPN (and therefore Cheshire) on a regular basis.
Time for a topical and amusing dip into Google Images for the word of the day, "bloated", as in a very large database. This little marmot, who obviously has a small pie problem, represents our database for this afternoon.
The data to be moved is 1.3 GB, over a 2MB line, from a Windows 2003 Server to a Linux partition on our CentOS virtual machine. Given all the other scheduled adminnie jobs we have going on overnight I cannot afford for the whole process to take more than about 30 minutes; I sometimes think the network is busier out of office hours!
The solution is of course to compress the backup before sending it, and it turns out after a little investigation that there are a couple of products on the market which do this for you. After only a brief search I found SQL Backup by Red Gate Software and LiteSpeed for SQL Server by Quest, which both offer on-the-fly compression, and encryption to boot. One might have expected Microsoft to include a compression option in their rather expensive server system, but one would be wrong as usual (thanks Bill). A bit less time writing the EULA and more on the software next time, eh?
Moving swiftly on, the products mentioned above are relatively simple in their approach in that they add some system stored procedures to your SQL Server install, which can be scheduled to run, adding compression and/or encryption to a standard full or differential backup. The cost is quite manageable as well at $45 to $399, which is a lot less than it would cost in time to build our own script! But incidentally, if anyone is interested, here is a start. Installing the programs on our test bench was easy, and initially everything was very straightforward, until we tried to send the backup to our Linux machine....
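To give a flavour of what the scheduled job ends up calling, here is a rough sketch from memory of the Red Gate approach: an extended stored procedure wrapping a normal BACKUP statement. The database name, path and password are made up, and the exact switches may differ from the current docs, so treat this as illustrative only (LiteSpeed does something similar with an xp_backup_database procedure):

-- Hedged sketch: Red Gate's sqlbackup procedure takes a BACKUP-like
-- command string with extra compression/encryption options.
-- Database name, path and password are illustrative only.
EXECUTE master..sqlbackup
    '-SQL "BACKUP DATABASE [OurBigDB]
           TO DISK = ''\\backupbox\sqlbackups\OurBigDB.sqb''
           WITH COMPRESSION = 2, PASSWORD = ''s3cret''"'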
In order to share a folder on Linux one requires the cooperation of a service called Samba, which is really quite powerful and therefore complicated. Sharing a folder to any Tom, Dick or Harry is very well documented and quite easy; authenticating against our Active Directory and sharing with our Windows machines, however, requires the services of a good soothsayer and is somewhat sparsely documented!
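For anyone heading down the same road, the pieces that eventually mattered for us were roughly these. A minimal smb.conf sketch, assuming Samba 3 with Winbind; the domain, path and account names are placeholders for your own:

# /etc/samba/smb.conf - minimal sketch for an AD-joined share
# (domain, path and account names are made up)
[global]
    workgroup = OURDOMAIN
    realm = OURDOMAIN.LOCAL
    # authenticate against Active Directory via Kerberos
    security = ads
    winbind use default domain = yes
    # map AD accounts onto local uids/gids
    idmap uid = 10000-20000
    idmap gid = 10000-20000

# the share SQL Server backs up to; the account the SQL Server
# service logs in with needs to be allowed in here
[sqlbackups]
    path = /data/sqlbackups
    valid users = OURDOMAIN\sqlservice
    writeable = yes

On top of that the box has to be joined to the domain (net ads join -U Administrator) with winbindd running, which is where the soothsaying comes in.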
Having finally got a shared folder onto the Windows network I thought the job was done, but unfortunately I hadn't counted on a couple of less well documented features of SQL Server.
1. SQL Server does not like backing up to network shares that are not in the same workgroup or domain
2. SQL Server will not offer to re-authenticate; the user account SQL Server logs in with must have explicit and full access to the network share (see the sketch below)
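To make point 2 concrete, a plain backup to a UNC path looks like the statement below, and it fails with an access-denied error unless the account the SQL Server service runs as can write to that share (the server, share and database names are made up):

-- Runs under the SQL Server service account, NOT the user executing it,
-- so that account needs full rights on the share (names are illustrative)
BACKUP DATABASE [OurBigDB]
    TO DISK = '\\centosbox\sqlbackups\OurBigDB.bak'
    WITH INIT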
I would love to give a blow-by-blow account of how I got the network share going, but I have been chipping away at this problem sporadically and unfortunately I have rather lost track of how I got to where we are.
Having taken a while to sort this out I have finally got some comparative data, and I am very impressed indeed! Given that a 2MB line is really quite modest, SQL Backup managed to compress, encrypt and squeeze a 1.4 GB database down to 250 MB and ferry it across Cheshire in 20 minutes and 10 seconds! It took me a while longer to get LiteSpeed up and running, but at only $45 it ripped through the compression and transfer in a mere 10 minutes and shaved 30 MB off the storage requirements at 220 MB! It's my new best friend and I would recommend it to anyone looking to get their database backups as far away from them as possible. The only thing left to sort out is that I cannot afford to change the logon account for the main server as I did on the test machine, so hopefully I can get Samba to cooperate with the existing setup this time :o)
Wednesday, 18 April 2007
Nagios - The Final Word
Having posted about installing Nagios on CentOS I have finally had a few comments on BeerBytes (hooray). I have also featured in 2 little news posts on other Linux sites; click here for the latest, and check out the blogroll on the right for a proper link. Given the obvious appetite for this subject (no, not me, network monitoring) I suppose it's only fair that I should continue dispensing nuggets of information, given that it's the only subject which has raised a flicker of interest.
Having got the basics set up, I spent some time last week getting the network diagram straight; as we have quite a busy network with 60 nodes I wanted to monitor, the automatic function did not do it justice. Moving on from this, I fell foul of having installed the incorrect plugin archive: for some reason check_ping worked fine, but I wanted to start monitoring DNS, MySQL and HTTP on a couple of servers and these plugins would not run. To test your plugins you can simply move to the plugins directory and type ./pluginname; for example, ./check_ping --help will tell you what command line parameters the ping plugin requires, then you can try it again with these supplied. To continue the example, ./check_ping -H www.yahoo.com -w 1000,10% -c 1000,10%
returns PING OK - Packet loss = 0%, RTA = 84.63 ms. The commands are already set up for the standard plugins in commands.cfg if your install went OK.
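The same trick works for the other plugins I was chasing; a few hedged examples, with the host names, DNS server and credentials made up for illustration:

# each plugin documents itself via --help; then a test call looks like:
./check_http -H www.example.com                  # fetch a page over HTTP
./check_dns -H www.example.com -s 192.168.0.1    # resolve via a given DNS server
./check_mysql -H dbserver1 -u nagios -p secret   # log in and confirm MySQL is alive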
So to sort this problem out I ran off to try my old mate Dag's archive :o) I found the appropriate rpm and bingo, everything works well. If you are relatively new to Linux, as I am, and you are struggling to connect to the repository with yum or rpm, there is a quick and dirty workaround. Simply find the link to the rpm in your web browser, copy the link, get back to your shell, type wget and paste the link; this will download the package to your machine. Next type rpm -i and the name of the downloaded file and this will install everything. Apologies if that was embarrassingly basic, but yum can be a bit hit and miss for me.
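In other words, something like this, where the URL and file name are placeholders for whatever your browser found:

# quick and dirty: download the rpm directly, then install it
wget http://some.repository/path/nagios-plugins-x.y.z.i386.rpm
rpm -i nagios-plugins-x.y.z.i386.rpm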
The documentation for installing Nagios is quite good, but the documentation for actually using the many and various standard plugins is really quite sketchy, so look outside the nagios.org site for that information. The Nagios Plugins page on SourceForge is your starting point, but it's not obvious.
In order to start monitoring services, a little editing of services.cfg is required, followed by creating a couple of new hostgroups for similar machines. For example, I wanted to monitor MySQL on 2 machines, so I declared the service in services.cfg as follows:
define service{
        use                       generic-service
        name                      mysql-service
        is_volatile               0
        check_period              24x7
        max_check_attempts        5
        normal_check_interval     1
        retry_check_interval      1
        notification_interval     20
        notification_period       24x7
        notification_options      n
        check_command             check-mysql-alive
        service_description       MYSQL
        contact_groups            nerds
        hostgroup_name            mysql_servers
        }
This will check the hostgroup mysql_servers every minute, 24 hours a day, and email the nerds group if there is a problem for 5 successive checks. It presumes a standard install, which predefines the 24x7 time period and the check-mysql-alive command in the relevant files; the hostgroup itself needs declaring too, as sketched below.
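For completeness, the presumed definitions look roughly like this if you need to create them yourself; the member names and credentials are from my own setup, so substitute yours:

# hostgroups.cfg (or wherever you keep them): the machines the service applies to
define hostgroup{
        hostgroup_name  mysql_servers
        alias           MySQL Servers
        members         dbserver1,dbserver2
        }

# commands.cfg: how the check is actually run ($USER1$ is the plugins path)
define command{
        command_name    check-mysql-alive
        command_line    $USER1$/check_mysql -H $HOSTADDRESS$ -u nagios -p secret
        }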
Now that I have these extra services being monitored there is some really quite useful information being generated, for example the number of connections to the MySQL servers and the response times of the HTTP servers. Another area being fine-tuned, via the timeperiods.cfg file, is when I want to be alerted about certain things: for example, I quite like getting an email if a router goes down overnight, but as some equipment is turned off overnight I don't particularly need to know about that (a sketch of a custom timeperiod follows). So, in short, just getting Nagios installed is the tip of the iceberg; the more you think about things, the more instances where good network monitoring is useful become apparent. The good news is that this fine-tuning is very quick and easy once you have the thing up and running.
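The timeperiod definitions themselves are pleasingly simple; a sketch of the sort of thing I mean, with made-up hours:

# timeperiods.cfg: assign this period to notifications for kit that gets
# switched off overnight, and keep 24x7 for the always-on equipment
define timeperiod{
        timeperiod_name office-hours
        alias           Office Hours
        monday          08:00-18:00
        tuesday         08:00-18:00
        wednesday       08:00-18:00
        thursday        08:00-18:00
        friday          08:00-18:00
        }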
One final point on Nagios before everyone gets bored: on Windows you can actually use Active Desktop to embed your live network map into your desktop, making sure you never miss a trick :o) Simply go to Desktop Properties -> Customise and paste your Nagios address, followed by /cgi-bin/statusmap.cgi?host=all, in as a new web address to embed into the desktop.
Some other things happening in our little team include a spontaneous upgrade to Adobe CS3. I haven't even got it installed yet, so you will all have to wait for some views and opinions, but one thing to remember is that you need bags of hard disk space: the download is about 1.5 GB, it unpacks to nearer 2 GB, and it then needs 5.6 GB for the programs, so unless you want to be cleaning up after every stage you need about 10 GB! Also, the knowledge base is filling nicely, and the more I use the product the more I like it; just one little point is that you have to keep going to different URLs to do different things, as it does not check your security and give you all the options you are entitled to.
Friday, 13 April 2007
A two-pronged approach to version control
We had a little IT retreat earlier this week to discuss how we can work more effectively as a team. Already this has spawned the knowledge base, which we are diligently filling with guff, but another thing which became apparent is that we need tighter version control on the source code for the applications we are developing, the aim being to make it easier for several people to work on the applications together without constantly tripping over each other trying to edit the same files.
In a previous project my compadre Rob and I created a sports club management tool as a team, and it very noticeably benefited from the contrasting styles and knowledge brought to bear on the task. We used Subversion for source control on this project, running on Windows, and although it was very useful it never quite delivered on all sides.
The reasons for this were mainly down to SVN's focus on text-based source code and the merge principle of teamwork. For example, if two people work on the same text file simultaneously, SVN can very cleverly merge the separate changes together, and 9 times out of 10 they will not conflict. Where the SVN system begins to come unstuck is with non-text source files like Flash FLA files: as these use a proprietary format you cannot merge them if 2 people have changed them simultaneously, so you immediately end up in conflict. In this scenario you need to lock a file on the server while you are editing it so that no one else can open it; the problem is that SVN is not very good at this, unless, as can happen, I have missed something.
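It may well be that I did miss something: SVN 1.2 and later do have a locking mechanism built around the svn:needs-lock property, roughly as sketched below (the file name is made up for illustration), though it relies on everyone remembering to take the lock:

# mark the FLA as needing a lock, so fresh working copies get it read-only
svn propset svn:needs-lock yes MyApp.fla
svn commit -m "Require locking on the Flash source"

# grab the lock before editing; others cannot lock it until it is released
svn lock -m "Editing the intro animation" MyApp.fla

# committing the file releases the lock by default, or release it by hand
svn unlock MyApp.fla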
Given that our new systems are being developed in Flash, with support from PHP files and a little MySQL thrown in, I decided to look more closely at the version control on offer in Flash and Dreamweaver. It turns out that although the current system is very good at locking files, using its 'Check In'/'Check Out' philosophy, it is not quite so good at keeping an entire repository synchronised unless Dreamweaver is your weapon of choice. As each of us in the team prefers a different HTML editor, this will work very well for the Flash files but not for the project as a whole. According to the Adobe site, the new CS3 version has been greatly improved in this respect.
So the solution which seems to present itself is in fact to use both systems in tandem, with Flash taking care of its proprietary source files and Subversion (via TortoiseSVN, in my case) taking care of the text-based files and having overall responsibility for the repository. Touch wood, this seems to be working nicely, but we have yet to get the whole team working on the project simultaneously.
If anyone else fancies having a go at this, installing Subversion on a new virtual server is very straightforward and there are lots of good tutorials on the subject; click here for the definitive guide for CentOS.
The Adobe site, or the online help for Dreamweaver or Flash, is the best place for information on how to use the current, simple Macromedia version control.
And finally, there is a short article here about fine-tuning the setup when using both of these systems concurrently.
Labels: Dreamweaver, Subversion, SVN, Tortoise, Version Control
Wednesday, 11 April 2007
The IT Brain Dump
One thing we have always struggled with in our little IT department is the sharing of important information. If I set up a new system I might note the details in a book, or even in a shared document, but we have never quite found a system which works for us all, and as a result we cannot always put our hands on other people's knowledge quickly and easily. Last week we decided to have another go at organising our information, and whilst wading through the available knowledge management tools we came across a couple of gems.
One system which came top of the list on Google was an open source system called TWiki, which I must say was my first choice for a while. They have some very big companies using the software and it looks like a very simple system which, in the tradition of wikis, allows all users to contribute towards a knowledge base. I think my main gripe was that I wanted something which looked a bit more easily organised and more like an application than a simple website, although in TWiki's defence I did only look through the demo for a few minutes.
The product I found in the end was a very nice PHP application called PHPKB, as in PHP Knowledge Base. Although this would not be to everyone's taste, the fact that the application is available as a simple and very reasonably priced series of PHP pages suits our setup here perfectly. You have to have access to, or know how to set up, a web server and a MySQL database to serve the application, but I did note that the company offers a free setup service, and they do have a hosted option. We simply added a new virtual server to our main virtual host and had the system running in about an hour. The installation is quite simple and managing the system once it's running is very straightforward; I have already started posting bits of information about systems, and it's surprising, once you get started, how many very important nuggets are stashed in emails and even in your head.
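For the record, the prerequisites on a stock CentOS box amount to a standard LAMP stack plus an empty database for the installer to fill; a hedged sketch, where the database name, user and password are made up:

# rough prerequisites for a PHP/MySQL app like this on CentOS
yum install httpd php php-mysql mysql-server
mysql -u root -p -e "CREATE DATABASE phpkb;
    GRANT ALL ON phpkb.* TO 'phpkb'@'localhost' IDENTIFIED BY 'secret';"
# then unpack the PHP pages under the web root and let the
# application's own install script do the rest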
One of the other nice things about this particular system is that some categories of information can be public and some can be protected, so that in-depth technical information can be cordoned off while some of the less technical information and tips can be made available to everyone in the organisation. As with all such systems, the usefulness will only become apparent when we have been using it for some time, but I would say that, with a little perseverance, it has the potential to save an awful lot of time and stress.
On another point, regular readers of 'AVFTR' will have noticed that someone actually commented on a post yesterday :o) In fact the nice gentleman concerned even blogged about the blog! I must add it to the blogroll. I am now braced for a massive increase in traffic; I might call Blogspot to make sure they have the capacity, because in the last 4 hours of yesterday I had 30 visitors.
Thursday, 5 April 2007
Nagios on CentOS - a grudging union
CentOS is one of the great network operating systems. It was developed by a group of people who saw that Red Hat Enterprise Linux version 4 had become super reliable but slightly bloaty, so they exercised their rights under the GNU public licence, got the source of RHEL4, put it on a stairmaster and gave it to the people.
Likewise, Nagios is a great open source network monitoring system; if you are a Linux user and run a network, chances are you will have come across it, since short of forking out about £1000 it is in fact pretty much your only option. About 12 months ago I installed Nagios on Fedora and it was a breeze; even though Nagios is a very comprehensive system requiring lots of fiddly configuration, on Fedora if you follow the instructions you will succeed in getting going in about an hour.
Unfortunately, given that CentOS and Fedora have a common ancestry and are very similar, trying to install Nagios on CentOS will drive you up the wall. Unless I have done something stupid without realising it, installing the system from RHEL4 RPMs seems to scatter the files from one end of the disk to the other, and it takes lots of patience to track them all down and link everything up. My advice would be to follow the instructions to the letter, but don't be surprised if you don't find the files where you expect them. Click here for the main Nagios site; this post is not a guide to installing Nagios on CentOS, just an amendment to the install guide based upon my rather frustrating experience.
Just in case I forget or anyone else trips over this, the locations are as follows:
Config CFG files - /etc/nagios
Web interface files - /usr/share/nagios
Log files - /var/log/
CGI files - /usr/lib/nagios/cgi
A guy called Dag has done some CentOS RPMs, but I couldn't subscribe to his repository; if you can, it is quite possible that he has reworked the install to follow the instructions. I couldn't resist doing my 'Google Images' thing for Dag; it turns out this Swedish guy is also comfortable going by the name Dag. There are some great translations for Dag on Wikipedia: in Swedish it means 'day' and in Turkish it refers to a 'mountain'.
So, now the dust has settled after our mammoth network rewire last week and Nagios is running sweetly, I feel quite satisfied with everything. As expected, we have had a few static routes crawl out of the woodwork, and we have renewed our efforts to use DNS rather than IP addresses for routing around the network. It turns out reversing the VPN connections was not all that it promised, and we have moved them all back again; it also seems that having Nagios running is actually very good for the stability of the VPN, as the frequent pinging seems to keep the routers awake and the tunnels in good repair.
One job left to complete is to define a custom status map for Nagios; as we have over a hundred nodes on the network being monitored, the auto-generated map is a bit of a mess, so I have to define the map by hand, which is a bit of a pain (a sketch of how follows). That said, it will look very nice, as we have purchased an icon library for our software development and their networking set is very sweet. See left for a sneak peek; note, however, that our main managed switches are not down, it is just that Netgear have issued a firmware upgrade they are short of. One day I would love to do a more comprehensive Flash front-end to Nagios, but frankly right now I have better things to do.
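Defining the map by hand boils down to giving each host fixed coordinates, and optionally a nicer icon, in an extended host info definition; a minimal sketch, with the host and image names made up:

define hostextinfo{
        host_name       router-manchester     ; one entry per host on the map
        icon_image      router.png            ; icon for the web pages
        statusmap_image router.gd2            ; gd2 version used by the status map
        2d_coords       100,250               ; x,y position on the map
        }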
Another job I think would pay dividends would be to set up a secondary DNS server at Manchester; it is probably quite straightforward, but I think I will let the dust settle before attempting that one.
Wednesday, 4 April 2007
Some Excellent Manipulation
An interesting little job came up yesterday which involved formatting data in an Excel spreadsheet. We have some lists which have to look pretty but are edited frequently, and we were having to spend a lot of time ensuring that these lists had a reliable and consistent format. Lots of ideas spring to mind for a job like this, and the temptation is to go for yet another little database application, but in this case that really felt like it would be overkill.
The solution which appears to have legs is to create a rather nifty Excel parser using a couple of useful PHP add-ons. For those of you who don't know what a parser is, the definition on Wikipedia is "the process of analyzing a sequence of tokens to determine its grammatical structure"; in layman's terms, think of it as a digester of documents. You push one in one end and it reads it, digests it and magically supplies a result, or in this case a completely reformatted spreadsheet. This will allow us to keep our data in a very simple unformatted spreadsheet but, by running it through our new system, have a nicely formatted, consistent look ready to print in a click. It's all summed up nicely by another of my random dips into Google Images, this time for the word "parse"; see image right.
So if anyone ever has need of such a beast or, more likely, if in 6 months' time I have forgotten what I did and need a reference, the 2 places to go are PEAR for the Excel spreadsheet writer add-on and SourceForge for the Excel reader add-on. When these are installed and working individually there is no reason why they cannot be used in the same PHP script in a push-me-pull-you sort of fashion (a sketch of the loop follows). It only took a couple of hours to get a basic system running, and you can even allow the user to specify some parameters with their spreadsheet. So, for example, you pass in a raw sheet of data, a title, a font and a relative font size, and the parser, running through the writer, will apply sizings and fonts in defined ways to different columns of data; it will even specify margins and printing areas so the document is completely ready to roll.
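To save my future self some digging, here is a minimal sketch of that read-and-rewrite loop using the two add-ons (PEAR's Spreadsheet_Excel_Writer and the SourceForge Spreadsheet_Excel_Reader); the file names and formatting choices are made up for illustration, so treat it as a starting point rather than our actual script:

<?php
// hypothetical sketch: digest raw.xls and write a formatted pretty.xls
require_once 'Spreadsheet/Excel/Writer.php';  // the PEAR writer
require_once 'reader.php';                    // the SourceForge reader

// read the raw spreadsheet; cells are 1-indexed in the reader
$reader = new Spreadsheet_Excel_Reader();
$reader->read('raw.xls');
$sheet = $reader->sheets[0];

// prepare the formatted output
$workbook  = new Spreadsheet_Excel_Writer('pretty.xls');
$worksheet =& $workbook->addWorksheet('Report');

$titleFormat =& $workbook->addFormat();
$titleFormat->setBold();
$titleFormat->setSize(14);

$bodyFormat =& $workbook->addFormat();
$bodyFormat->setSize(10);

// title row, then copy the data across with a consistent style
$worksheet->write(0, 0, 'Our Pretty List', $titleFormat);
for ($row = 1; $row <= $sheet['numRows']; $row++) {
    for ($col = 1; $col <= $sheet['numCols']; $col++) {
        $cell = isset($sheet['cells'][$row][$col]) ? $sheet['cells'][$row][$col] : '';
        // the writer is 0-indexed, hence the offsets
        $worksheet->write($row, $col - 1, $cell, $bodyFormat);
    }
}

$workbook->close();
?>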
Keep tuning in for the definitive guide to installing Nagios on CentOS without 'going postal', later this week.
A view from the rack is the personal blog of an IT manager who works for a pub company - hence beer