Integration with a Customer Portal
Traffic Sentinel's scripted query capability provides a flexible mechanism for extracting traffic information that can be presented to network users. In many cases there may be an existing "customer portal" that provides access to information on network outages, scheduled maintenance and a mechanism for raising trouble tickets. This tutorial describes techniques for making traffic information accessible to customers in a secure and scalable way. This tutorial assumes that you are familiar with the script API (see the Scripting Queries tutorial). Some of the examples in this tutorial require Traffic Sentinel version 2.0.33 or later.
Note: Traffic Sentinel's web interface is designed for network managers and operators. Traffic Sentinel's ability to provide a detailed, network-wide view of traffic is very useful for network operations, but access to this sensitive information should be limited. In order to provide information to customers, you need to ensure that there is no leakage of confidential information between customers, or leakage of information about the internal configuration of the network that would compromise security.
There are two approaches to providing data to customers:
- Standard Reports, create a standard set of charts and tables that describes each customer's traffic.
- Ad-Hoc Queries, provide a way for customers to make queries to selected parts of the database.
Standard Reports
The first task is to find a way to identify network and/or traffic objects that are associated with each customer. This association will tend to be network dependent:
- switch port, in many networks customers are associated with the interface(s) on the edge switches that connect their servers to the network. This is often the case with colocation facilities.
- MAC address, the MAC address(es) associated with customer routers may identify customers. This can be the case with a layer 2 exchange.
- CIDR, customers might be allocated particular IP addresses or address blocks.
The next task is to make the customer information available to the query script. There are a number of options (a brief sketch follows this list):
- file, dump the information as a file in a location accessible from the Traffic Sentinel server.
- url, make the information available through a web server. The information could be created dynamically through a php or CGI script, or it could simply be a way of exporting the file.
- script, you can invoke the query from within a Perl script and pass the information as part of the query.
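For example, a query script can load customer data that has been published at a URL using readurl(), or pick up values passed on the command line with the query command's -input option (both techniques appear later in this tutorial). A minimal sketch, in which the URL is simply a placeholder for wherever you publish the document:

// Option 1: fetch the customer data from a web server
// (the URL is a placeholder for your own portal or file export)
var customersString = readurl("http://portal.example.com/customers.xml");
var customers = new XML(customersString);

// Option 2: values supplied on the command line, e.g.
//   /usr/local/inmsf/bin/query -input customerid=1001 myscript.txt
// are available in the script as variables named after the -input keys
println("customerid=" + customerid);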
In this example we will assume that the customer information is available as an XML document:
<customers> <customer name="InMon Corp." id="1001"> <port agent="10.0.0.1" ifName="ethernet5" /> <port agent="10.0.0.1" ifName="ethernet6" /> </customer> <customer name="NewCo Inc." id="1002"> <port agent="10.0.0.4" ifName="A4" /> </customer> </customers>In this document, each customer has a name and a unique customer identifier. Each customer has been allocated one or more switch ports. Each switch port is described by the switch IP address and the ifName of the interface.
Suppose that you want to create charts for each customer showing traffic trends for each of their interfaces. The following script (trends.txt) creates the charts:
var url = "http://www.inmon.com/tutorials2/customers.xml"; var rootdir = "/var/www/html/customers/"; var height = 200; var width = 400; // get customer data var customersString = readurl(url); var customers = new XML(customersString); var network = Network.current(); // create list of customer interfaces var ifs = new Array(); for each (var customer in customers..customer) { for each (var port in customer..port) { network.path = port.@agent + ">" + port.@ifName; var ifId = network.interfaceId(); if(ifId) ifs.push(ifId); } } // query for trends on each customer interface var query = Query.trend( "historycounters", "time,interface,rate(ifinoctets),rate(ifoutoctets)", "interface=" + ifs, "yesterday", 5); var table = query.run(); // split results by customer var trends = new Array(); var pvt = table.pivotTime(0,1,2); for(var j = 1; j < pvt.ncols; j++) { var t = new Table(); t.start = table.start; t.end = table.end; t.addColumn("Time","time",pvt.column(0)); t.addColumn("In","double",pvt.column(j)); t.scaleColumn(1,8); trends[pvt.cnames[j]] = t; } pvt = table.pivotTime(0,1,3); for(var j = 1; j < pvt.ncols; j++) { var t = trends[pvt.cnames[j]]; t.addColumn("Out","double",pvt.column(j)); } // create charts for each customer for each (var customer in customers..customer) { var n = 1; for each (var port in customer..port) { network.path = port.@agent + ">" + port.@ifName; var ifId = network.interfaceId(); if(ifId) { var t = trends[ifId]; var chart = Chart.multiSeries( "trend", "Interface:" + n++, t, "Time", 0, "Bits per Second", [1,2]); chart.height = height; chart.width = width; write( rootdir + customer.@id + "/" + port.@agent + "-" + port.@ifName + ".png", chart); } else println( "cannot find interface, agent=" + port.@agent + " ifName=" + port.@ifName); } }
Most reporting scripts will have a similar structure, following the same basic steps:
- Retrieve customer data
- Create a filter to select only the data relevant to the customers
- Run a query to retrieve the data
- Split the resulting data into individual customer tables
- Create charts for each customer and write them to disk
The following steps use the crontab command to automatically run this script every day at 2:00am to update the customer charts. Type:
crontab -e

and add the following line to your list of tasks:
0 2 * * * /usr/local/inmsf/bin/query /home/pp/scripts/trends.txt

You may want to construct multiple scripts, each of which updates a different type of chart. A script run every 10 minutes might update trend charts showing data over the last hour. A script run every month could provide charts that display top contributors to usage charges.
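For example, a crontab along the following lines would maintain several families of charts at different intervals (hourly.txt and monthly.txt are hypothetical script names standing in for your own reports):

# daily trend charts at 2:00am
0 2 * * * /usr/local/inmsf/bin/query /home/pp/scripts/trends.txt
# hourly trend charts, refreshed every 10 minutes
*/10 * * * * /usr/local/inmsf/bin/query /home/pp/scripts/hourly.txt
# monthly usage charts, generated on the 1st of each month
0 3 1 * * /usr/local/inmsf/bin/query /home/pp/scripts/monthly.txt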
At this point we have a family of charts being maintained automatically. Each customer's charts are stored in a directory corresponding to their customer ID. The next step is to decide how to make this data available to the customer portal.
There are a number of options for exporting the data:
- NFS, you can use an NFS mount either to write the data directly onto the customer portal server, or to allow the customer portal server to access the customer data. This technique is very convenient if the Traffic Sentinel server and the customer portal server are near each other on a secure network.
- rsync, you can use the rsync utility to copy updated information to the customer portal. Rsync can be used in conjunction with ssh to provide a secure mechanism for transferring the data between the servers (see the example command after this list).
- HTTP, you can configure Apache on the Traffic Sentinel server to export the data so that it is available via HTTP (or HTTPS if security is a concern). Where there is no existing customer portal, you could construct a simple web interface that allows users to see their traffic data through pages published on the Traffic Sentinel server.
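For example, a cron job on the Traffic Sentinel server could push the chart directories to the portal with an rsync command along these lines (the portal host name and destination directory are assumptions):

rsync -az -e ssh /var/www/html/customers/ portal.example.com:/var/www/portal/customers/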
In the next section we will look at techniques for allowing customers to make queries that access their own traffic data.
Ad-Hoc Queries
Ad-Hoc queries are useful for providing information that is needed immediately or information that is only occasionally accessed.
In this example, suppose you want to allow customers to make a query that would list all the hosts that connected to a particular IP address during a specified period. This type of query provides a useful audit trail to the customer in the case where they suspect a host might have been compromised.
In this case, assume that instead of being associated with switch ports, each customer is associated with one or more IP address blocks:
<customers> <customer name="InMon Corp." id="1001"> <cidr address="10.1.1.0" bits="24" /> <cidr address="10.1.2.0" bits="24" /> </customer> <customer name="NewCo Inc." id="1002"> <cidr address="10.1.4.0" bits="25" /> </customer> </customers>
The following script (connections.txt) allows a customer to query their data:
var url = "http://www.inmon.com/tutorials2/customers_cidr.xml"; // get customer data var customersString = readurl(url); var customers = new XML(customersString); // get cidrs for customerid var cidrs = new Array(); for each(var cidr in customers.customer.(@id == customerid)..cidr) { cidrs.push(cidr.@address + "/" + cidr.@bits); } if(cidrs.length > 0) { var where = "(ipsource=" + cidrs + "|" + "ipdestination=" + cidrs + ")"; where += "&(ipsource=" + target + "|" + "ipdestination=" + target + ")"; println(where); var query = Query.topN( "historytrmx", "sourceaddress,sourceport,destinationaddress,destinationport,bytes", where, "today", "bytes", 100); var t = query.run(); t.printHTML(true); }
Finally, the script needs to be integrated with the customer portal. Assuming that the portal is written in PHP, here is an example script:
<?php
$customerid = '1001';
$target = $_GET['target'];
if($target) {
  // pass the authenticated customer id and the requested target address
  // to the query script as -input parameters
  exec('/usr/local/inmsf/bin/query'
       .' -input customerid='.$customerid
       .' -input target='.escapeshellarg($target)
       .' /home/pp/scripts/connections.txt', $result);
  foreach($result as $line) echo($line);
}
else {
  echo '<form method="GET" action="'.$_SERVER['PHP_SELF'].'">';
  echo '<input type="text" size="12" name="target">';
  echo '<button type="submit">OK</button>';
  echo '</form>';
}
?>
The customer needs to be authenticated before they are given access to the query form. The variable $customerid must be set to the authenticated customer's id.
This example demonstrates the basic elements of a customer facing query:
- Authenticate the customer
- Extract query parameters
- Construct a filter that restricts the customer to their own data
- Append additional customer supplied filters
- Run query
- Format the results