Loading the perf dashboard takes multiple seconds
https://bugs.webkit.org/show_bug.cgi?id=141860
Reviewed by Andreas Kling.
This patch introduces caches of the JSON files returned by /api/ in the /data/ directory. It also records
the last time test_runs rows associated with the requested platforms and metrics were inserted, updated,
or removed ("last modified time") in the caches as well as in the manifest JSON files. Because the manifest
is regenerated each time a new test result is reported, the front end can compare the last modified time in
the manifest file with that in an /api/runs JSON cache to detect staleness.
More concretely, the front end first optimistically fetches the JSON in /data/. If the cache doesn't exist
or the last modified time in the cache doesn't match that in the manifest file, it fetches the JSON again
via /api/runs. If the cache did exist, we render the charts from the cache in the meantime.
This dramatically reduces the perceived latency of the page load since the charts are drawn immediately
from the cache, and we only re-render them once the new up-to-date JSON comes in.
This patch also changes the format of runs JSONs by pushing the existing properties into 'configurations'
and adding 'lastModified' and 'elapsedTime' at the top level.
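The cache-then-refetch flow described above can be sketched roughly as follows. This is a simplified illustration, not the dashboard's actual code; fetchCached, fetchUncached, and render are hypothetical stand-ins for the /data/ fetch, the /api/runs fetch, and the chart-drawing code:

```javascript
// Optimistic cache fetch: draw charts from the cached JSON right away,
// then refetch via the API only if the cache is missing or stale.
function fetchRuns(manifestLastModified, fetchCached, fetchUncached, render) {
    return fetchCached().then(function (cached) {
        if (cached)
            render(cached); // Draw the charts immediately from the cache.
        var stale = !cached || cached.lastModified != manifestLastModified;
        if (!stale)
            return cached; // Cache is current; nothing left to do.
        // Cache missing or out of date: refetch via /api/runs and re-render.
        return fetchUncached().then(function (fresh) {
            render(fresh);
            return fresh;
        });
    });
}
```

With a stale cache, render fires twice (first with the cached data, then with the fresh data), which is exactly the "charts appear instantly, then update" behavior the patch is after.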
* init-database.sql: Added config_runs_last_modified to test_configurations table as well as a trigger to
auto-update this column upon changes to test_runs table.
* public/admin/test-configurations.php:
(add_run): Regenerate the manifest file to invalidate the /api/runs JSON cache.
(delete_run): Ditto.
* public/api/runs.php:
(main): Fetch all columns of test_configurations table including config_runs_last_modified. Also generate
the cache in /data/ directory.
(RunsGenerator::__construct): Compute the last modified time for this (platform, metric) pair.
(RunsGenerator::results): Put the old content in 'configurations' property and include 'lastModified' and
'elapsedTime' properties. 'elapsedTime' is added for debugging purposes.
(RunsGenerator::add_runs):
(RunsGenerator::parse_revisions_array):
* public/include/db.php:
(CONFIG_DIR): Added.
(generate_data_file): Added based on ManifestGenerator::store.
(Database::to_js_time): Extracted from RunsGenerator::add_runs to share code.
* public/include/json-header.php:
(success_json): Renamed from echo_success. Returns the serialized JSON instead of echoing it so that we can
generate caches in /api/runs/.
(exit_with_success):
* public/include/manifest.php:
(ManifestGenerator::generate): Added 'elapsedTime' property for the time taken to generate the manifest.
It seems like we're generating it in 200-300ms for now so that's good.
(ManifestGenerator::store): Uses generate_data_file.
(ManifestGenerator::platforms): Added 'lastModified' array to each platform entry. This array contains the
last modified time for each (platform, metric) pair.
* public/index.html:
(fetchTest): Updated per the format change in runs JSON.
* public/v2/app.js:
(App.Pane._fetch): Fetch the cached JSON first. Refetch the uncached version if instructed as such.
(App.Pane._updateChartData): Extracted from App.Pane._fetch.
(App.Pane._handleFetchErrors): Ditto.
* public/v2/data.js:
(RunsData.fetchRuns): Takes a fourth argument indicating whether we should fetch the cached version or not.
The cached JSON is located in /data/ with the same filename. When fetching a cached JSON results in 404,
fulfill the promise with null as the result instead of rejecting it. The only client of this function which
sets useCache to true is App.Manifest.fetchRunsWithPlatformAndMetric, and it handles this special case.
* public/v2/manifest.js:
(App.DateArrayTransform): Added. Handles the array of last modified dates in platform objects.
(App.Platform.lastModifiedTimeForMetric): Added. Returns the last modified date in the manifest JSON.
(App.Manifest.fetchRunsWithPlatformAndMetric): Takes "useCache" like RunsData.fetchRuns. Sets shouldRefetch
to true if the response is null (the cache didn't exist) or the cache is out of date.
(App.Manifest._formatFetchedData): Extracted from App.Manifest.fetchRunsWithPlatformAndMetric.
* run-tests.js:
(initializeDatabase): Avoid splitting function definitions in the middle.
* tests/api-report.js: Added tests to verify that reporting new test results updates the last modified time
in test_configurations.
git-svn-id: https://svn.webkit.org/repository/webkit/trunk@180468 268f45cc-cd09-0410-ab3c-d52691b4dbfc
+2015-02-20 Ryosuke Niwa <rniwa@webkit.org>
+
+ Loading the perf dashboard takes multiple seconds
+ https://bugs.webkit.org/show_bug.cgi?id=141860
+
+ Reviewed by Andreas Kling.
+
+ This patch introduces caches of the JSON files returned by /api/ in the /data/ directory. It also records
+ the last time test_runs rows associated with the requested platforms and metrics were inserted, updated,
+ or removed ("last modified time") in the caches as well as in the manifest JSON files. Because the manifest
+ is regenerated each time a new test result is reported, the front end can compare the last modified time in
+ the manifest file with that in an /api/runs JSON cache to detect staleness.
+
+ More concretely, the front end first optimistically fetches the JSON in /data/. If the cache doesn't exist
+ or the last modified time in the cache doesn't match that in the manifest file, it fetches the JSON again
+ via /api/runs. If the cache did exist, we render the charts from the cache in the meantime.
+ This dramatically reduces the perceived latency of the page load since the charts are drawn immediately
+ from the cache, and we only re-render them once the new up-to-date JSON comes in.
+
+ This patch also changes the format of runs JSONs by pushing the existing properties into 'configurations'
+ and adding 'lastModified' and 'elapsedTime' at the top level.
+
+ * init-database.sql: Added config_runs_last_modified to test_configurations table as well as a trigger to
+ auto-update this column upon changes to test_runs table.
+
+ * public/admin/test-configurations.php:
+ (add_run): Regenerate the manifest file to invalidate the /api/runs JSON cache.
+ (delete_run): Ditto.
+
+ * public/api/runs.php:
+ (main): Fetch all columns of test_configurations table including config_runs_last_modified. Also generate
+ the cache in /data/ directory.
+ (RunsGenerator::__construct): Compute the last modified time for this (platform, metric) pair.
+ (RunsGenerator::results): Put the old content in 'configurations' property and include 'lastModified' and
+ 'elapsedTime' properties. 'elapsedTime' is added for debugging purposes.
+ (RunsGenerator::add_runs):
+ (RunsGenerator::parse_revisions_array):
+
+ * public/include/db.php:
+ (CONFIG_DIR): Added.
+ (generate_data_file): Added based on ManifestGenerator::store.
+ (Database::to_js_time): Extracted from RunsGenerator::add_runs to share code.
+
+ * public/include/json-header.php:
+ (success_json): Renamed from echo_success. Returns the serialized JSON instead of echoing it so that we can
+ generate caches in /api/runs/.
+ (exit_with_success):
+
+ * public/include/manifest.php:
+ (ManifestGenerator::generate): Added 'elapsedTime' property for the time taken to generate the manifest.
+ It seems like we're generating it in 200-300ms for now so that's good.
+ (ManifestGenerator::store): Uses generate_data_file.
+ (ManifestGenerator::platforms): Added 'lastModified' array to each platform entry. This array contains the
+ last modified time for each (platform, metric) pair.
+
+ * public/index.html:
+ (fetchTest): Updated per the format change in runs JSON.
+
+ * public/v2/app.js:
+ (App.Pane._fetch): Fetch the cached JSON first. Refetch the uncached version if instructed as such.
+ (App.Pane._updateChartData): Extracted from App.Pane._fetch.
+ (App.Pane._handleFetchErrors): Ditto.
+
+ * public/v2/data.js:
+ (RunsData.fetchRuns): Takes a fourth argument indicating whether we should fetch the cached version or not.
+ The cached JSON is located in /data/ with the same filename. When fetching a cached JSON results in 404,
+ fulfill the promise with null as the result instead of rejecting it. The only client of this function which
+ sets useCache to true is App.Manifest.fetchRunsWithPlatformAndMetric, and it handles this special case.
+
+ * public/v2/manifest.js:
+ (App.DateArrayTransform): Added. Handles the array of last modified dates in platform objects.
+ (App.Platform.lastModifiedTimeForMetric): Added. Returns the last modified date in the manifest JSON.
+ (App.Manifest.fetchRunsWithPlatformAndMetric): Takes "useCache" like RunsData.fetchRuns. Sets shouldRefetch
+ to true if the response is null (the cache didn't exist) or the cache is out of date.
+ (App.Manifest._formatFetchedData): Extracted from App.Manifest.fetchRunsWithPlatformAndMetric.
+
+ * run-tests.js:
+ (initializeDatabase): Avoid splitting function definitions in the middle.
+
+ * tests/api-report.js: Added tests to verify that reporting new test results updates the last modified time
+ in test_configurations.
+
2015-02-20 Ryosuke Niwa <rniwa@webkit.org>
REGRESSION(r180333): Analysis tasks can't be associated with bugs
CREATE TABLE repositories (
repository_id serial PRIMARY KEY,
+ repository_parent integer REFERENCES repositories ON DELETE CASCADE,
repository_name varchar(64) NOT NULL,
repository_url varchar(1024),
- repository_blame_url varchar(1024));
+ repository_blame_url varchar(1024),
+ CONSTRAINT repository_name_must_be_unique UNIQUE(repository_parent, repository_name));
CREATE TABLE bug_trackers (
tracker_id serial PRIMARY KEY,
config_platform integer NOT NULL REFERENCES platforms ON DELETE CASCADE,
config_type test_configuration_type NOT NULL,
config_is_in_dashboard boolean NOT NULL DEFAULT FALSE,
+ config_runs_last_modified timestamp NOT NULL DEFAULT (CURRENT_TIMESTAMP AT TIME ZONE 'UTC'),
CONSTRAINT configuration_must_be_unique UNIQUE(config_metric, config_platform, config_type));
CREATE INDEX config_platform_index ON test_configurations(config_platform);
iteration_relative_time float,
PRIMARY KEY (iteration_run, iteration_order));
+CREATE OR REPLACE FUNCTION update_config_last_modified() RETURNS TRIGGER AS $update_config_last_modified$
+ BEGIN
+ IF TG_OP != 'DELETE' THEN
+ UPDATE test_configurations SET config_runs_last_modified = (CURRENT_TIMESTAMP AT TIME ZONE 'UTC') WHERE config_id = NEW.run_config;
+ ELSE
+ UPDATE test_configurations SET config_runs_last_modified = (CURRENT_TIMESTAMP AT TIME ZONE 'UTC') WHERE config_id = OLD.run_config;
+ END IF;
+ RETURN NULL;
+ END;
+$update_config_last_modified$ LANGUAGE plpgsql;
+
+CREATE TRIGGER update_config_last_modified AFTER INSERT OR UPDATE OR DELETE ON test_runs
+ FOR EACH ROW EXECUTE PROCEDURE update_config_last_modified();
+
CREATE TABLE reports (
report_id serial PRIMARY KEY,
report_builder integer NOT NULL REFERENCES builders ON DELETE RESTRICT,
$db->commit_transaction();
notice("Added a baseline test run.");
+
+ regenerate_manifest();
}
function delete_run($run_id, $build_id) {
}
$db->commit_transaction();
+
+ regenerate_manifest();
}
if ($db) {
$platform_id = intval($parts[0]);
$metric_id = intval($parts[1]);
- $config_rows = $db->query_and_fetch_all('SELECT config_id, config_type, config_platform, config_metric
+ $config_rows = $db->query_and_fetch_all('SELECT *
FROM test_configurations WHERE config_metric = $1 AND config_platform = $2', array($metric_id, $platform_id));
if (!$config_rows)
exit_with_error('ConfigurationNotFound');
header("Cache-Control: maxage=$maxage");
}
- $generator = new RunsGenerator();
+ $generator = new RunsGenerator($config_rows);
foreach ($config_rows as $config) {
if ($test_group_id)
$generator->add_runs($config['config_type'], $raw_runs);
}
- exit_with_success($generator->results());
+ $content = success_json($generator->results());
+ if (!$test_group_id)
+ generate_data_file("$platform_id-$metric_id.json", $content);
+ echo $content;
}
class RunsGenerator {
- function __construct() {
+ function __construct($config_rows) {
$this->results = array();
+ $last_modified_times = array();
+ foreach ($config_rows as $row)
+ array_push($last_modified_times, Database::to_js_time($row['config_runs_last_modified']));
+ $this->last_modified = max($last_modified_times);
+ $this->start_time = microtime(true);
}
- function &results() { return $this->results; }
+ function results() {
+ return array(
+ 'configurations' => &$this->results,
+ 'lastModified' => $this->last_modified,
+ 'elapsedTime' => microtime(true) - $this->start_time);
+ }
function add_runs($name, $raw_runs) {
$formatted_runs = array();
'squareSum' => floatval($run['run_square_sum_cache']),
'revisions' => self::parse_revisions_array($run['revisions']),
'build' => $run['build_id'],
- 'buildTime' => strtotime($run['build_time']) * 1000,
+ 'buildTime' => Database::to_js_time($run['build_time']),
'buildNumber' => intval($run['build_number']),
'builder' => $run['build_builder']);
}
$name_and_revision = explode(',', trim($item, '()'));
if (!$name_and_revision[0])
continue;
- $time = strtotime(trim($name_and_revision[2], '"')) * 1000;
+ $time = Database::to_js_time(trim($name_and_revision[2], '"'));
$revisions[trim($name_and_revision[0], '"')] = array(trim($name_and_revision[1], '"'), $time);
}
return $revisions;
$_config = NULL;
+define('CONFIG_DIR', dirname(__FILE__) . '/../../');
+
function config($key) {
global $_config;
if (!$_config)
- $_config = json_decode(file_get_contents(dirname(__FILE__) . '/../../config.json'), true);
+ $_config = json_decode(file_get_contents(CONFIG_DIR . 'config.json'), true);
return $_config[$key];
}
+function generate_data_file($filename, $content) {
+ if (!assert(ctype_alnum(str_replace(array('-', '_', '.'), '', $filename))))
+ return FALSE;
+ return file_put_contents(CONFIG_DIR . config('dataDirectory') . '/' . $filename, $content);
+}
+
if (config('debug')) {
error_reporting(E_ALL | E_STRICT);
ini_set('display_errors', 'On');
return $value == 't';
}
+ static function to_js_time($time_str) {
+ return strtotime($time_str) * 1000;
+ }
+
function connect() {
$databaseConfig = config('database');
$this->connection = pg_connect('host=' . $databaseConfig['host'] . ' port=' . $databaseConfig['port']
exit(1);
}
-function echo_success($details = array()) {
+function success_json($details = array()) {
$details['status'] = 'OK';
merge_additional_details($details);
- echo json_encode($details);
+ return json_encode($details);
}
function exit_with_success($details = array()) {
- echo_success($details);
+ echo success_json($details);
exit(0);
}
}
function generate() {
+ $start_time = microtime(true);
+
$config_table = $this->db->fetch_table('test_configurations');
$platform_table = $this->db->fetch_table('platforms');
$repositories_table = $this->db->fetch_table('repositories');
'bugTrackers' => $this->bug_trackers($repositories_table),
'dashboards' => config('dashboards'),
);
- return $this->manifest;
+
+ $this->manifest['elapsedTime'] = (microtime(true) - $start_time) * 1000;
+
+ return TRUE;
}
function store() {
- return file_put_contents(self::MANIFEST_PATH, json_encode($this->manifest));
+ return generate_data_file('manifest.json', json_encode($this->manifest));
}
private function tests() {
if ($is_dashboard && !$this->db->is_true($config_row['config_is_in_dashboard']))
continue;
+ $new_last_modified = array_get($config_row, 'config_runs_last_modified', 0);
+ if ($new_last_modified)
+ $new_last_modified = strtotime($config_row['config_runs_last_modified']) * 1000;
+
$platform = &array_ensure_item_has_array($platform_metrics, $config_row['config_platform']);
- if (!in_array($config_row['config_metric'], $platform))
- array_push($platform, $config_row['config_metric']);
+ $metrics = &array_ensure_item_has_array($platform, 'metrics');
+ $last_modified = &array_ensure_item_has_array($platform, 'last_modified');
+
+ $metric_id = $config_row['config_metric'];
+ $index = array_search($metric_id, $metrics);
+ if ($index === FALSE) {
+ array_push($metrics, $metric_id);
+ array_push($last_modified, $new_last_modified);
+ } else
+ $last_modified[$index] = max($last_modified[$index], $new_last_modified);
}
}
+ $configurations = array();
+
$platforms = array();
if ($platform_table) {
foreach ($platform_table as $platform_row) {
if ($this->db->is_true($platform_row['platform_hidden']))
continue;
$id = $platform_row['platform_id'];
- if (array_key_exists($id, $platform_metrics))
- $platforms[$id] = array('name' => $platform_row['platform_name'], 'metrics' => $platform_metrics[$id]);
+ if (array_key_exists($id, $platform_metrics)) {
+ $platforms[$id] = array(
+ 'name' => $platform_row['platform_name'],
+ 'metrics' => $platform_metrics[$id]['metrics'],
+ 'lastModified' => $platform_metrics[$id]['last_modified']);
+ }
}
}
return $platforms;
return runs;
}
- $.getJSON('api/runs/' + filename, function (data) {
+ $.getJSON('api/runs/' + filename, function (response) {
+ var data = response.configurations;
callback(createRunAndResults(data.current), createRunAndResults(data.baseline), createRunAndResults(data.target));
});
}
else if (!this._isValidId(metricId))
this.set('failure', metricId ? 'Invalid metric id:' + metricId : 'Metric id was not specified');
else {
- var self = this;
-
- App.Manifest.fetchRunsWithPlatformAndMetric(this.get('store'), platformId, metricId).then(function (result) {
- self.set('platform', result.platform);
- self.set('metric', result.metric);
- self.set('chartData', result.data);
- self._updateMovingAverageAndEnvelope();
- }, function (result) {
- if (!result || typeof(result) === "string")
- self.set('failure', 'Failed to fetch the JSON with an error: ' + result);
- else if (!result.platform)
- self.set('failure', 'Could not find the platform "' + platformId + '"');
- else if (!result.metric)
- self.set('failure', 'Could not find the metric "' + metricId + '"');
- else
- self.set('failure', 'An internal error');
- });
-
+ var store = this.get('store');
+ var updateChartData = this._updateChartData.bind(this);
+ var handleErrors = this._handleFetchErrors.bind(this, platformId, metricId);
+ var useCache = true;
+ App.Manifest.fetchRunsWithPlatformAndMetric(store, platformId, metricId, null, useCache).then(function (result) {
+ updateChartData(result);
+ if (!result.shouldRefetch)
+ return;
+
+ useCache = false;
+ App.Manifest.fetchRunsWithPlatformAndMetric(store, platformId, metricId, null, useCache)
+ .then(updateChartData, handleErrors);
+ }, handleErrors);
this.fetchAnalyticRanges();
}
}.observes('platformId', 'metricId').on('init'),
+ _updateChartData: function (result)
+ {
+ this.set('platform', result.platform);
+ this.set('metric', result.metric);
+ this.set('chartData', result.data);
+ this._updateMovingAverageAndEnvelope();
+ },
+ _handleFetchErrors: function (platformId, metricId, result)
+ {
+ console.log(platformId, metricId, result)
+ if (!result || typeof(result) === "string")
+ this.set('failure', 'Failed to fetch the JSON with an error: ' + result);
+ else if (!result.platform)
+ this.set('failure', 'Could not find the platform "' + platformId + '"');
+ else if (!result.metric)
+ this.set('failure', 'Could not find the metric "' + metricId + '"');
+ else
+ this.set('failure', 'An internal error');
+ },
fetchAnalyticRanges: function ()
{
var platformId = this.get('platformId');
// FIXME: We need to devise a way to fetch runs in multiple chunks so that
// we don't have to fetch the entire time series to just show the last 3 days.
-RunsData.fetchRuns = function (platformId, metricId, testGroupId)
+RunsData.fetchRuns = function (platformId, metricId, testGroupId, useCache)
{
- var filename = platformId + '-' + metricId + '.json';
+ var url = useCache ? '../data/' : '../api/runs/';
+ url += platformId + '-' + metricId + '.json';
if (testGroupId)
- filename += '?testGroup=' + testGroupId;
+ url += '?testGroup=' + testGroupId;
return new Ember.RSVP.Promise(function (resolve, reject) {
- $.getJSON('../api/runs/' + filename, function (data) {
- if (data.status != 'OK') {
- reject(data.status);
+ $.getJSON(url, function (response) {
+ if (response.status != 'OK') {
+ reject(response.status);
return;
}
- delete data.status;
+ delete response.status;
+ var data = response.configurations;
for (var config in data)
data[config] = new RunsData(data[config]);
+
+ if (response.lastModified)
+ response.lastModified = new Date(response.lastModified);
- resolve(data);
+ resolve(response);
}).fail(function (xhr, status, error) {
- reject(xhr.status + (error ? ', ' + error : ''));
+ if (xhr.status == 404 && useCache)
+ resolve(null);
+ else
+ reject(xhr.status + (error ? ', ' + error : ''));
})
});
}
}.property('name', 'test'),
fullName: function ()
{
- return this.get('path').join(' \u220b ') /* ∋ */
+ return this.get('path').join(' \u220b ') /* ∈ */
+ ' : ' + this.get('label');
}.property('path', 'label'),
});
repositories: DS.hasMany('repository'),
});
+App.DateArrayTransform = DS.Transform.extend({
+ deserialize: function (serialized)
+ {
+ return serialized.map(function (time) { return new Date(time); });
+ }
+});
+
App.Platform = App.NameLabelModel.extend({
_metricSet: null,
_testSet: null,
metrics: DS.hasMany('metric'),
+ lastModified: DS.attr('dateArray'),
containsMetric: function (metric)
{
if (!this._metricSet)
this._metricSet = new Ember.Set(this.get('metrics'));
return this._metricSet.contains(metric);
},
+ lastModifiedTimeForMetric: function (metric)
+ {
+ var index = this.get('metrics').indexOf(metric);
+ if (index < 0)
+ return null;
+ return this.get('lastModified').objectAt(index);
+ },
containsTest: function (test)
{
if (!this._testSet) {
dashboards.forEach(function (dashboard) { self._dashboardByName[dashboard.get('name')] = dashboard; });
this._defaultDashboardName = dashboards.length ? dashboards[0].get('name') : null;
},
- fetchRunsWithPlatformAndMetric: function (store, platformId, metricId, testGroupId)
+ fetchRunsWithPlatformAndMetric: function (store, platformId, metricId, testGroupId, useCache)
{
+ Ember.assert("Can't cache results for test groups", !(testGroupId && useCache));
+ var self = this;
return Ember.RSVP.all([
- RunsData.fetchRuns(platformId, metricId, testGroupId),
+ RunsData.fetchRuns(platformId, metricId, testGroupId, useCache),
this.fetch(store),
]).then(function (values) {
- var runs = values[0];
+ var response = values[0];
var platform = App.Manifest.platform(platformId);
var metric = App.Manifest.metric(metricId);
- var suffix = metric.get('name').match('([A-z][a-z]+|FrameRate)$')[0];
- var unit = {
- 'FrameRate': 'fps',
- 'Runs': '/s',
- 'Time': 'ms',
- 'Malloc': 'bytes',
- 'Heap': 'bytes',
- 'Allocations': 'bytes'
- }[suffix];
- var smallerIsBetter = unit != 'fps' && unit != '/s'; // Assume smaller is better for unit-less metrics.
-
- var useSI = unit == 'bytes';
- var unitSuffix = unit ? ' ' + unit : '';
- var deltaFormatterWithoutSign = useSI ? d3.format('.2s') : d3.format('.2g');
return {
platform: platform,
metric: metric,
- data: {
- current: runs.current.timeSeriesByCommitTime(),
- baseline: runs.baseline ? runs.baseline.timeSeriesByCommitTime() : null,
- target: runs.target ? runs.target.timeSeriesByCommitTime() : null,
- unit: unit,
- formatWithUnit: function (value) { return this.formatter(value) + unitSuffix; },
- formatWithDeltaAndUnit: function (value, delta)
- {
- return this.formatter(value) + (delta && !isNaN(delta) ? ' \u00b1 ' + deltaFormatterWithoutSign(delta) : '') + unitSuffix;
- },
- formatter: useSI ? d3.format('.4s') : d3.format('.4g'),
- deltaFormatter: useSI ? d3.format('+.2s') : d3.format('+.2g'),
- smallerIsBetter: smallerIsBetter,
- }
+ data: response ? self._formatFetchedData(metric.get('name'), response.configurations) : null,
+ shouldRefetch: !response || +response.lastModified < +platform.lastModifiedTimeForMetric(metric),
};
});
},
+ _formatFetchedData: function (metricName, configurations)
+ {
+ var suffix = metricName.match('([A-z][a-z]+|FrameRate)$')[0];
+ var unit = {
+ 'FrameRate': 'fps',
+ 'Runs': '/s',
+ 'Time': 'ms',
+ 'Malloc': 'bytes',
+ 'Heap': 'bytes',
+ 'Allocations': 'bytes'
+ }[suffix];
+
+ var smallerIsBetter = unit != 'fps' && unit != '/s'; // Assume smaller is better for unit-less metrics.
+
+ var useSI = unit == 'bytes';
+ var unitSuffix = unit ? ' ' + unit : '';
+ var deltaFormatterWithoutSign = useSI ? d3.format('.2s') : d3.format('.2g');
+
+ return {
+ current: configurations.current.timeSeriesByCommitTime(),
+ baseline: configurations.baseline ? configurations.baseline.timeSeriesByCommitTime() : null,
+ target: configurations.target ? configurations.target.timeSeriesByCommitTime() : null,
+ unit: unit,
+ formatWithUnit: function (value) { return this.formatter(value) + unitSuffix; },
+ formatWithDeltaAndUnit: function (value, delta)
+ {
+ return this.formatter(value) + (delta && !isNaN(delta) ? ' \u00b1 ' + deltaFormatterWithoutSign(delta) : '') + unitSuffix;
+ },
+ formatter: useSI ? d3.format('.4s') : d3.format('.4g'),
+ deltaFormatter: useSI ? d3.format('+.2s') : d3.format('+.2g'),
+ smallerIsBetter: smallerIsBetter,
+ };
+ }
}).create();
var firstError;
var queue = new TaskQueue();
- commaSeparatedSqlStatements.split(/;\s*/).forEach(function (statement) {
+ commaSeparatedSqlStatements.split(/;\s*(?=CREATE|DROP)/).forEach(function (statement) {
queue.addTask(function (error, callback) {
client.query(statement, function (error) {
if (error && !firstError)
});
});
});
+
+ var reportsUpdatingDifferentTests = [
+ [{
+ "buildNumber": "123",
+ "buildTime": "2013-02-28T10:12:03",
+ "builderName": "someBuilder",
+ "builderPassword": "somePassword",
+ "platform": "Mountain Lion",
+ "tests": {"test1": {"metrics": {"Time": {"current": 3}}}}
+ }],
+ [{
+ "buildNumber": "124",
+ "buildTime": "2013-02-28T11:31:21",
+ "builderName": "someBuilder",
+ "builderPassword": "somePassword",
+ "platform": "Mountain Lion",
+ "tests": {"test2": {"metrics": {"Time": {"current": 3}}}}
+ }],
+ [{
+ "buildNumber": "125",
+ "buildTime": "2013-02-28T12:45:34",
+ "builderName": "someBuilder",
+ "builderPassword": "somePassword",
+ "platform": "Mountain Lion",
+ "tests": {"test1": {"metrics": {"Time": {"current": 3}}}}
+ }],
+ ];
+
+ function fetchTestConfig(testName, metricName, callback) {
+ queryAndFetchAll('SELECT * FROM tests, test_metrics, test_configurations WHERE test_id = metric_test AND metric_id = config_metric'
+ + ' AND test_name = $1 AND metric_name = $2', [testName, metricName], function (runRows) {
+ assert.equal(runRows.length, 1);
+ callback(runRows[0]);
+ });
+ }
+
+ it("should update the last modified date of test configurations with new runs", function () {
+ addBuilder(reportsUpdatingDifferentTests[0], function () {
+ postJSON('/api/report/', reportsUpdatingDifferentTests[0], function (response) {
+ assert.equal(response.statusCode, 200);
+ fetchTestConfig('test1', 'Time', function (originalConfig) {
+ postJSON('/api/report/', reportsUpdatingDifferentTests[2], function (response) {
+ assert.equal(response.statusCode, 200);
+ fetchTestConfig('test1', 'Time', function (config) {
+ assert.notEqual(+originalConfig['config_runs_last_modified'], +config['config_runs_last_modified']);
+ notifyDone();
+ });
+ });
+ });
+ });
+ });
+ });
+
+ it("should update the last modified date of unrelated test configurations", function () {
+ addBuilder(reportsUpdatingDifferentTests[0], function () {
+ postJSON('/api/report/', reportsUpdatingDifferentTests[0], function (response) {
+ assert.equal(response.statusCode, 200);
+ fetchTestConfig('test1', 'Time', function (originalConfig) {
+ postJSON('/api/report/', reportsUpdatingDifferentTests[1], function (response) {
+ assert.equal(response.statusCode, 200);
+ fetchTestConfig('test1', 'Time', function (config) {
+ assert.equal(+originalConfig['config_runs_last_modified'], +config['config_runs_last_modified']);
+ notifyDone();
+ });
+ });
+ });
+ });
+ });
+ });
});