Merge branch 'master' into feature/dev-tools

lukasolson committed Aug 24, 2016
2 parents 08fcb97 + 19a8738 commit 44990e5

Showing 140 changed files with 1,844 additions and 7,003 deletions.
24 changes: 20 additions & 4 deletions CONTRIBUTING.md
@@ -20,8 +20,10 @@ A high level overview of our contributing guidelines.
- [Testing and Building](#testing-and-building)
- [Debugging Unit Tests](#debugging-unit-tests)
- [Unit Testing Plugins](#unit-testing-plugins)
- [Running Browser Automation Tests](#running-browser-automation-tests)
- [Browser Automation Notes](#browser-automation-notes)
- [Cross-browser compatibility](#cross-browser-compatibility)
- [Testing compatibility locally](#testing-compatibility-locally)
- [Running Browser Automation Tests](#running-browser-automation-tests)
- [Browser Automation Notes](#browser-automation-notes)
- [Building OS packages](#building-os-packages)
- [Signing the contributor license agreement](#signing-the-contributor-license-agreement)
- [Submitting a Pull Request](#submitting-a-pull-request)
@@ -219,7 +221,21 @@ To run the tests for just your particular plugin, assuming your plugin lives outside
npm run test:dev -- --kbnServer.testsBundle.pluginId=some_special_plugin --kbnServer.plugin-path=../some_special_plugin
```

### Running Browser Automation Tests
### Cross-browser Compatibility

#### Testing Compatibility Locally

##### Testing IE on OS X

* [Download VMWare Fusion](http://www.vmware.com/products/fusion/fusion-evaluation.html).
* [Download IE virtual machines](https://developer.microsoft.com/en-us/microsoft-edge/tools/vms/#downloads) for VMWare.
* Open VMWare and go to Window > Virtual Machine Library. Unzip the virtual machine and drag the .vmx file into your Virtual Machine Library.
* Right-click on the virtual machine you just added to your library and select "Snapshots...", and then click the "Take" button in the modal that opens. You can roll back to this snapshot when the VM expires in 90 days.
* In System Preferences > Sharing, change your computer name to be something simple, e.g. "computer".
* Run Kibana with `npm start -- --no-ssl --host=computer.local` (substituting your computer name).
* Now you can run your VM, open the browser, and navigate to `http://computer.local:5601` to test Kibana.

#### Running Browser Automation Tests

The following will start Kibana, Elasticsearch, and the chromedriver for you. To run the functional UI tests, use the following commands:

@@ -242,7 +258,7 @@ To execute the front-end browser tests, enter the following. This requires the s
npm run test:ui:runner
```

#### Browser Automation Notes
##### Browser Automation Notes

- Using Page Objects pattern (https://theintern.github.io/intern/#writing-functional-test)
- At least the initial tests for the Settings, Discover, and Visualize tabs all depend on a very specific set of logstash-type data (generated with makelogs). Since that is a static set of data, all the Discover and Visualize tests use a specific Absolute time range. This guarantees the same results each run.
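
The Page Objects note above is easier to follow with a concrete shape in mind. The sketch below is purely illustrative — the `DiscoverPage` name, the CSS selectors, and the wiring to Intern are assumptions rather than code from this commit — but the `findByCssSelector`, `getVisibleText`, `clearValue`, `type`, and `pressKeys` calls follow the Leadfoot command API that Intern functional tests use.

```
// Hypothetical page object for the Discover tab (names and selectors are
// illustrative only). `remote` is the Leadfoot command object that Intern
// hands to functional tests.
export default class DiscoverPage {
  constructor(remote) {
    this.remote = remote;
  }

  // Reads the hit count displayed at the top of the Discover tab.
  getHitCount() {
    return this.remote
      .findByCssSelector('.discover-info-hits')
      .getVisibleText();
  }

  // Types a query into the search bar and submits it with Enter.
  query(searchString) {
    return this.remote
      .findByCssSelector('.kibana-nav-search')
      .clearValue()
      .type(searchString)
      .pressKeys('\uE007'); // WebDriver key code for Enter
  }
}
```

Tests then drive the tab through these methods instead of repeating selectors, which is what keeps the Discover and Visualize suites maintainable.
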
4 changes: 4 additions & 0 deletions config/kibana.yml
@@ -65,6 +65,10 @@
# headers, set this value to [] (an empty list).
# elasticsearch.requestHeadersWhitelist: [ authorization ]

# Header names and values that are sent to Elasticsearch. Any custom headers cannot be overwritten
# by client-side headers, regardless of the elasticsearch.requestHeadersWhitelist configuration.
# elasticsearch.customHeaders: {}

# Time in milliseconds for Elasticsearch to wait for responses from shards. Set to 0 to disable.
# elasticsearch.shardTimeout: 0

2 changes: 2 additions & 0 deletions docs/kibana-yml.asciidoc
@@ -38,6 +38,8 @@ wait for Elasticsearch to respond to pings.
Elasticsearch. This value must be a positive integer.
`elasticsearch.requestHeadersWhitelist:`:: *Default: `[ 'authorization' ]`* List of Kibana client-side headers to send to Elasticsearch.
To send *no* client-side headers, set this value to [] (an empty list).
`elasticsearch.customHeaders:`:: *Default: `{}`* Header names and values to send to Elasticsearch. Any custom headers
cannot be overwritten by client-side headers, regardless of the `elasticsearch.requestHeadersWhitelist` configuration.
`elasticsearch.shardTimeout:`:: *Default: 0* Time in milliseconds for Elasticsearch to wait for responses from shards. Set
to 0 to disable.
`elasticsearch.startupTimeout:`:: *Default: 5000* Time in milliseconds to wait for Elasticsearch at Kibana startup before
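
To make the `elasticsearch.customHeaders` precedence concrete, here is a rough sketch of the documented behaviour — an illustration of the rule described above, not Kibana's actual request-building code, and the header names in the example are made up:

```
// Illustrative only: whitelisted client headers are applied first and the
// configured elasticsearch.customHeaders last, so custom values always win.
function buildElasticsearchHeaders(clientHeaders, requestHeadersWhitelist, customHeaders) {
  const whitelisted = {};
  for (const name of requestHeadersWhitelist) {
    if (clientHeaders[name] !== undefined) {
      whitelisted[name] = clientHeaders[name];
    }
  }
  // customHeaders is merged last, so a client-side header can never overwrite it.
  return Object.assign({}, whitelisted, customHeaders);
}

buildElasticsearchHeaders(
  { authorization: 'Basic abc123', 'x-tenant': 'from-client' },
  ['authorization'],
  { 'x-tenant': 'from-config' }
);
// => { authorization: 'Basic abc123', 'x-tenant': 'from-config' }
```
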
4 changes: 2 additions & 2 deletions docs/production.asciidoc
@@ -345,10 +345,10 @@ name of your cluster.
--------
cluster.name: "my_cluster"
--------
. Make sure Kibana is configured to point to your local client node. In `kibana.yml`, the `elasticsearch_url` should be set to
. Make sure Kibana is configured to point to your local client node. In `kibana.yml`, the `elasticsearch.url` should be set to
`localhost:9200`.
+
--------
# The Elasticsearch instance to use for all your queries.
elasticsearch_url: "http://localhost:9200"
elasticsearch.url: "http://localhost:9200"
--------
4 changes: 2 additions & 2 deletions package.json
@@ -100,8 +100,8 @@
"csv-parse": "1.1.0",
"d3": "3.5.6",
"dragula": "3.7.0",
"elasticsearch": "12.0.0-rc4",
"elasticsearch-browser": "12.0.0-rc4",
"elasticsearch": "12.0.0-rc5",
"elasticsearch-browser": "12.0.0-rc5",
"even-better": "7.0.2",
"expiry-js": "0.1.7",
"exports-loader": "0.6.2",
11 changes: 8 additions & 3 deletions src/cli/cluster/base_path_proxy.js
@@ -9,6 +9,7 @@ import { readFileSync } from 'fs';
import Config from '../../server/config/config';
import setupConnection from '../../server/http/setup_connection';
import setupLogging from '../../server/logging';
import { DEV_SSL_CERT_PATH } from '../dev_ssl';

const alphabet = 'abcdefghijklmnopqrstuvwxyz'.split('');

@@ -24,9 +25,13 @@ export default class BasePathProxy {

const { cert } = config.get('server.ssl');
if (cert) {
this.proxyAgent = new HttpsAgent({
ca: readFileSync(cert)
});
const httpsAgentConfig = {};
if (cert === DEV_SSL_CERT_PATH && config.get('server.host') !== 'localhost') {
httpsAgentConfig.rejectUnauthorized = false;
} else {
httpsAgentConfig.ca = readFileSync(cert);
}
this.proxyAgent = new HttpsAgent(httpsAgentConfig);
}

if (!this.basePath) {
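
Restated as a standalone sketch for clarity — the `createProxyAgent` helper is not in the commit, and `HttpsAgent` is assumed to be Node's `https.Agent`, as the surrounding file suggests. The likely intent of the hunk above: when the bundled dev certificate is in use but `server.host` is not `localhost`, hostname verification against that certificate would fail, so the proxy skips verification instead of pinning the CA.

```
import { readFileSync } from 'fs';
import { Agent as HttpsAgent } from 'https';
import { DEV_SSL_CERT_PATH } from '../dev_ssl'; // new module added in this commit

// Sketch of the decision above; `cert` is server.ssl.cert and `host` is server.host.
function createProxyAgent(cert, host) {
  const httpsAgentConfig = {};
  if (cert === DEV_SSL_CERT_PATH && host !== 'localhost') {
    // The dev certificate only matches localhost, so with any other host the
    // proxy accepts the certificate without verification.
    httpsAgentConfig.rejectUnauthorized = false;
  } else {
    // Otherwise pin the configured certificate as the trusted CA, as before.
    httpsAgentConfig.ca = readFileSync(cert);
  }
  return new HttpsAgent(httpsAgentConfig);
}
```
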
3 changes: 3 additions & 0 deletions src/cli/dev_ssl.js
@@ -0,0 +1,3 @@
import { resolve } from 'path';
export const DEV_SSL_CERT_PATH = resolve(__dirname, '../../test/dev_certs/server.crt');
export const DEV_SSL_KEY_PATH = resolve(__dirname, '../../test/dev_certs/server.key');
6 changes: 4 additions & 2 deletions src/cli/serve/serve.js
@@ -6,6 +6,8 @@ import { fromRoot } from '../../utils';
import { getConfig } from '../../server/path';
import readYamlConfig from './read_yaml_config';

import { DEV_SSL_CERT_PATH, DEV_SSL_KEY_PATH } from '../dev_ssl';

let canCluster;
try {
require.resolve('../cluster/cluster_manager');
@@ -38,8 +40,8 @@ function readServerSettings(opts, extraCliOptions) {
set('optimize.lazy', true);
if (opts.ssl && !has('server.ssl.cert') && !has('server.ssl.key')) {
set('server.host', 'localhost');
set('server.ssl.cert', fromRoot('test/dev_certs/server.crt'));
set('server.ssl.key', fromRoot('test/dev_certs/server.key'));
set('server.ssl.cert', DEV_SSL_CERT_PATH);
set('server.ssl.key', DEV_SSL_KEY_PATH);
}
}

Binary file not shown.
62 changes: 62 additions & 0 deletions src/cli_plugin/install/__tests__/zip.js
@@ -80,6 +80,68 @@ describe('kibana cli', function () {

describe('extractFiles', function () {

describe('strip files parameter', function () {

it('strips specified number of directories', function () {

return copyReplyFile('strip_test.zip')
.then(() => {
return extractFiles(settings.tempArchiveFile, settings.workingPath, 1);
})
.then(() => {
const files = glob.sync('**/*', { cwd: testWorkingPath });
const expected = [
'1 level deep.txt',
'test-plugin',
'test-plugin/2 levels deep.txt',
'test-plugin/public',
'test-plugin/public/3 levels deep.txt',
'archive.part'
];
expect(files.sort()).to.eql(expected.sort());
});

});

it('throws an exception if it tries to strip too many directories', function () {

return copyReplyFile('strip_test.zip')
.then(() => {
return extractFiles(settings.tempArchiveFile, settings.workingPath, 2);
})
.then(shouldReject, (err) => {
expect(err.message).to.match(/You cannot strip more levels than there are directories/i);
});

});

it('applies the filter before applying the strip directories logic', function () {

return copyReplyFile('strip_test.zip')
.then(() => {
const filter = {
paths: [
'test-plugin'
]
};

return extractFiles(settings.tempArchiveFile, settings.workingPath, 2, filter);
})
.then(() => {
const files = glob.sync('**/*', { cwd: testWorkingPath });
const expected = [
'2 levels deep.txt',
'public',
'public/3 levels deep.txt',
'archive.part'
];
expect(files.sort()).to.eql(expected.sort());
});

});

});

it('extracts files using the files filter', function () {
return copyReplyFile('test_plugin_many.zip')
.then(() => {
12 changes: 8 additions & 4 deletions src/cli_plugin/install/zip.js
@@ -63,11 +63,15 @@ export async function extractFiles(zipPath, targetPath, strip, filter) {

unzipper.on('error', reject);

unzipper.extract({
const options = {
path: targetPath,
strip: strip,
filter: extractFilter(filter)
});
strip: strip
};
if (filter) {
options.filter = extractFilter(filter);
}

unzipper.extract(options);

unzipper.on('extract', resolve);
});
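
A usage sketch of `extractFiles` as the new tests above exercise it — the archive and target paths here are placeholders, not values from the commit:

```
// extractFiles lives in src/cli_plugin/install/zip.js; the paths below are
// placeholders for illustration.
import { extractFiles } from './zip';

const archivePath = '/tmp/strip_test.zip';
const targetPath = '/tmp/extracted-plugin';

// Strip one leading directory and keep only entries under 'test-plugin'.
extractFiles(archivePath, targetPath, 1, { paths: ['test-plugin'] })
  .then(() => console.log('extracted'))
  .catch((err) => console.error(err.message));

// With no filter argument, options.filter is now simply left undefined rather
// than being set to extractFilter(undefined).
extractFiles(archivePath, targetPath, 1);
```
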
56 changes: 0 additions & 56 deletions src/core_plugins/console/api_server/es_1_0.js

This file was deleted.
