Hedgehog Tools Guide 2.4.0

The Hedgehog tools are a set of scripts used to perform various tasks involved in creating and maintaining the database and the Hedgehog files on disk. They can be run directly from /libexec/hedgehog or, for convenience, via the provided wrapper script (/bin/hedgehogctl):

>hedgehogctl help
Run a Hedgehog command
Usage: hedgehogctl [help|list] [COMMAND] [command_options]

The available hedgehogctl commands are described below.

Data Manager
  Database
    database_init
    database_manage_partitions
    database_process_rssac_data
    database_rm_old_rssac_data
    database_update_geoip
    database_update_nodes
    database_update_service_addr
    database_update_tlds_from_zone
  Datafiles
    datafiles_create_summary
    datafiles_rm_empty_xml_dirs
    datafiles_tar_old_xml_files

Web front-end
  Plotcache
    plotcache_generate_cached_plots
    plotcache_rm_cached_plots
  RSSAC
    rssac_generate_reports

Other tools
  conf_read

Data Manager

These scripts should be run as the DB_OWNER user (hedgehog by default).

Database

database_init

This script should be run after the database has been created (see the /libexec/database_create helper script). It populates the database with the plot and server/node data: it creates and populates the database tables with the plot options and nodes. It calls the hedgehog_database_update_nodes.sh script, which populates the database with the specified nodes and creates a directory structure for them in the working data directory.

Usage: database_init options

Supported options:
  -m  Start month of oldest data to be imported (default is current month, format YYYY-MM)
  -h  Show this help
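For example, a start month for -m can be computed on the command line. The following is a sketch using GNU date; the six-month window is an arbitrary illustration, not a recommendation:

```shell
# Compute an oldest-month-to-import value (here, arbitrarily, six
# months ago) in the YYYY-MM format that database_init's -m expects.
start_month=$(date -d '6 months ago' +%Y-%m)
echo "Would run: hedgehogctl database_init -m ${start_month}"
# As the DB_OWNER user, the real invocation would be something like:
# sudo -u hedgehog hedgehogctl database_init -m "${start_month}"
```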

database_manage_partitions

This script creates new tables in the database to hold the imported data.

It must be run at least once a month to create the tables for the next month, or the import will fail. It is recommended to configure a cron job as described in the Cron jobs section of the Installation guide; with that in place, the user only needs to run this script manually when new servers are added to an existing system.

It creates a new set of partitioned data tables in the database to hold imported data. The partitions are created per month of data.

Usage: database_manage_partitions options

Supported options:
  -m  Month to create partitions for (default is next month, format YYYY-MM)
  -h  Show this help
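A monthly cron job might look like the following hypothetical sketch. The schedule, install path and cron file name are assumptions; see the Cron jobs section of the Installation guide for the recommended setup:

```shell
# Hypothetical /etc/cron.d/hedgehog entry: create next month's
# partitions on the 1st of every month, as the DB_OWNER user
# (hedgehog by default). Path and time are assumptions.
0 2 1 * * hedgehog /usr/local/bin/hedgehogctl database_manage_partitions
```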

database_process_rssac_data

This script should be run to process the data ready for the rssac_generate_reports script. The importer inserts data from the XML files into the 'unique_source_raw' table. For each node, this script creates entries in the 'unique_source_summary' table based on the raw data and then (optionally) deletes the original raw data from the 'unique_source_raw' table.

It processes RSSAC data for a specified 24 hour period.

Usage: database_process_rssac_data options

Supported options:
  -d  Date on which to process RSSAC data (default is 7 days ago, format YYYY-MM-DD)
  -f  Force overwrite of existing summary data
  -D  Also delete the raw unique_source data from the database to save space (default: no delete)
  -h  Show this help
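As a sketch, the date argument for -d can be computed explicitly. This example uses GNU date to reproduce the 7-days-ago default; the -f flag and the sudo invocation in the comment are illustrative:

```shell
# Process RSSAC data for one day, forcing an overwrite of any
# existing summary data. The -d default is 7 days ago; here the same
# date is computed explicitly in the expected YYYY-MM-DD format.
rssac_date=$(date -d '7 days ago' +%Y-%m-%d)
echo "Would run: hedgehogctl database_process_rssac_data -d ${rssac_date} -f"
# sudo -u hedgehog hedgehogctl database_process_rssac_data -d "${rssac_date}" -f
```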

database_rm_old_rssac_data

For some systems, the unique source raw data can become unmanageably large. This script can be used to truncate the database tables to recover the disk space; for example, it can be run once a month to remove all the raw data for the previous month.

It deletes all the raw unique_source data in the database for a given month. All data older than this is also deleted. This script is intended to be run monthly to free disk space on systems where the raw unique sources data becomes unmanageably large.

Usage: database_rm_old_rssac_data options

Supported options:
  -m  Month from which to delete all raw unique source data (format YYYY-MM; the default is 2 months ago when in the first half of the current month and 1 month ago when in the second half of the current month)
  -h  Show this help

database_update_geoip

This script downloads a reference database of locations for use by the GEO maps. It should be run periodically to pick up new location information.

It updates the GeoIP list in the database, reading its input from /tmp.

Usage: database_update_geoip options

Supported options:
  -h  Show this help

database_update_nodes

Run this script if new nodes or servers are added to the system. It adds/updates the servers and nodes in the database, reading its input from a file called nodes.csv in the /usr/local/etc/hedgehog directory.

- An example nodes.csv file is installed if one does not exist.
- See the comments in that file for details of the format.
- No action is taken on servers/nodes that are in the database but are not in the input files.

Usage: /usr/local/bin/hedgehog_database_update_nodes options

Supported options:
  -r  Rename/remove flag. This enables processing of the rename_or_remove field in nodes.csv. If this flag is not given, any lines in the input file that have text in the rename_or_remove field are ignored. Renaming a node updates the database and moves the data directory to the new name. Removing a node deletes all associated data in the database but leaves any data on disk untouched (for manual removal). Removing nodes can take a long time if you have a lot of data.
  -d  Debug mode (set -x)
  -h  Show this help

database_update_service_addr

Run this script to control which server addresses are displayed in the 'By server IP address' plot. This can be useful to filter the display down to just the server service addresses, since DSC does not have such a filter.

It updates the server service addresses list in the database. This list is used only to filter which addresses are displayed in the 'By server IP address' plot: if there are any addresses in the list, only those addresses are displayed; if the list is empty, all the addresses for which there is data are displayed. By default this script reads the /opt/hedgehog/etc/hedgehog/service_addr.csv file and adds the addresses found there.

After running this script it is advisable to remove any cached plots for the 'By server IP address' plot by using the plotcache_rm_cached_plots script.

Usage: database_update_service_addr options

Supported options:
  -d  Delete all existing entries in the server service address list
  -h  Show this help

database_update_tlds_from_zone

Run this script to update the categorisation of TLDs in the database from the root zone held in the IANA database.

It updates the TLD list in the database, reading its input from a directory on the IANA FTP site. No action is taken on TLDs that are in the database but are not in the input data.

Usage: database_update_tlds_from_zone options

Supported options:
  -h  Show this help

Datafiles

datafiles_create_summary

This is a utility script that can be used to get a high-level picture of the processing of the XML files for all the nodes in the system. It can help identify backlogs in the processing system.

It generates a processing report based on the information in the log files.

Usage: datafiles_create_summary options

Supported options:
  -s  Only report on the server with this name (default is all servers)
  -n  Only report on the node with this name (default is all nodes)
  -c  Output in CSV format for import into a spreadsheet application
  -d  Run in debug mode (set -x)
  -h  Show this help

datafiles_rm_empty_xml_dirs

Remove empty incoming XML directories after a certain period of time (the default is all the processed directories older than 7 days ago).

Usage: datafiles_rm_empty_xml_dirs options

Supported options:
  -d  Date before which to remove empty XML directories (default: 7 days ago, format YYYY-MM-DD)
  -h  Show this help

datafiles_tar_old_xml_files

Pack old XML files that have already been processed into a .tar package for storage and archiving. The XML files are packed according to their date, node and server. The filename has the format done-PID-date.tar.bz2 (e.g. done-14143-2015-05-01.tar.bz2).

Usage: datafiles_tar_old_xml_files options

Supported options:
  -h  Show this help

Web front-end

These scripts should be run as the DB_READ_USER user (www-data by default).

Plotcache

plotcache_generate_cached_plots

You must supply at least one option specifying which plot window to generate (-D, -W or -M). By default, the cached plots are generated daily for the previous day, between 00:00 and 23:59. It can be helpful to configure a daily cron job to run this script, so that the homepage and common plots load faster. If not relying on the default behaviour, be careful to specify BOTH a start and an end date, otherwise the end date will default to today and this may result in many plots being generated.

Create cached plots for each chosen period (day/week/month) within the given time range.

Usage: generate_cached_plots options

Supported options:
  -s  Start date of time range from which to create cached plots (default is 1 day before at 00:00, format YYYY-MM-DDThh:mm)
  -e  End date of time range from which to create cached plots (default is 1 day before at 23:59, format YYYY-MM-DDThh:mm)
  -D  Generate daily cached plots
  -W  Generate weekly cached plots
  -M  Generate monthly cached plots
  -h  Show this help
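The daily cron job mentioned above might look like the following hypothetical sketch; the install path, schedule and cron file name are assumptions:

```shell
# Hypothetical /etc/cron.d/hedgehog-plots entry: generate daily
# cached plots for yesterday (the script's default window) as the
# DB_READ_USER (www-data by default).
15 1 * * * www-data /usr/local/bin/hedgehogctl plotcache_generate_cached_plots -D
```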

plotcache_rm_cached_plots

The user should not normally need to run this script, but it is available in case problems are encountered with cached plots, or it is desirable to flush the cache for some reason (for example, the default interactive plot type was changed after graphs were cached). This script is interactive if the -a option is not selected: you will be asked to select which cached plot types you would like to remove.

It removes all or a selected set of cached plots from the directory structure.

Usage: plotcache_rm_cached_plots options

Supported options:
  -a  Remove all cached plots
  -h  Show this help

RSSAC

rssac_generate_reports

By default the script creates RSSAC reports for all configured servers for a date 7 days in the past. It requires that the database_process_rssac_data script has already been run for the date in question. Note that the unique_source data for the report is taken from the summary table, not the raw table.

It creates RSSAC reports for a specified 24 hour period.

Usage: rssac_generate_reports options

Supported options:
  -d  Date on which to create RSSAC reports (default is 7 days ago, format YYYY-MM-DD)
  -h  Show this help
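Because the reports require the summary data to exist, any scheduled setup must run database_process_rssac_data before rssac_generate_reports. A hypothetical sketch (paths, times and cron file name are assumptions):

```shell
# Hypothetical /etc/cron.d entries illustrating the required ordering:
# process the RSSAC data first, then generate the reports for the
# same default date (7 days ago), both as the DB_OWNER user.
0 2 * * * hedgehog /usr/local/bin/hedgehogctl database_process_rssac_data
0 4 * * * hedgehog /usr/local/bin/hedgehogctl rssac_generate_reports
```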

Other tools

conf_read

This script parses the contents of the /etc/hedgehog/hedgehog.yaml file into environment variables for use by the other scripts. Users will not need to run it directly. The database parameters are to be set by the user (if the defaults need to be modified); the directory structure parameters are auto-generated and better left alone.
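As an illustration of the pattern only (this is NOT the real conf_read, and the keys and values below are invented examples), a script can turn simple top-level "key: value" YAML lines into environment variables like this:

```shell
# Minimal sketch of the conf_read idea: export simple top-level
# "key: value" pairs from a YAML-style file as upper-case environment
# variables. Invented example keys; not Hedgehog's actual config.
conf_file=$(mktemp)
printf 'db_name: hedgehog\ndb_port: 5432\n' > "$conf_file"
while IFS=': ' read -r key value; do
    export "$(echo "$key" | tr '[:lower:]' '[:upper:]')=$value"
done < "$conf_file"
echo "$DB_NAME $DB_PORT"   # prints: hedgehog 5432
rm -f "$conf_file"
```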
