
How To Use Kibana Dashboards and Visualizations

Part of the Tutorial Series: Centralized Logging with Logstash and Kibana On CentOS 7
Introduction
Kibana 4 is an analytics and visualization platform that builds on Elasticsearch to give you a better understanding of your data. In this tutorial, we will get you started with Kibana, by showing you how to use its interface to filter and visualize log messages gathered by an Elasticsearch ELK stack. We will cover the main interface components, and demonstrate how to create searches, visualizations, and dashboards.
Prerequisites
This tutorial is the third part in the Centralized Logging with Logstash and Kibana series.
It assumes that you have a working ELK setup. The examples assume that you are gathering syslog and Nginx access logs. If you are not gathering these types of logs, you should be able to modify the demonstrations to work with your own log messages.
If you want to follow this tutorial exactly as presented, you should have the following setup, by following the first two tutorials in this series:
An ELK Stack gathering syslogs:
Nginx access logs and filters:
When you are ready to move on, let's look at an overview of the Kibana interface.
Kibana Interface Overview
The Kibana interface is divided into four main sections: Discover, Visualize, Dashboard, and Settings.
We will go over the basics of each section, in the listed order, and demonstrate how each piece of the interface can be used.
Kibana Discover
When you first connect to Kibana 4, you will be taken to the Discover page. By default, this page will display all of your ELK stack's most recently received logs.
Here, you can filter through and find specific log messages based on Search Queries, then narrow the search results to a specific time range with the Time Filter.
Here is a breakdown of the Kibana Discover interface elements:
Search Bar: Directly under the main navigation menu. Use this to search specific fields and/or entire messages
Time Filter: Top-right (clock icon). Use this to filter logs based on various relative and absolute time ranges
Field Selector: Left, under the search bar. Select fields to modify which ones are displayed in the Log View
Date Histogram: Bar graph under the search bar. By default, this shows the count of all logs, versus time (x-axis), matched by the search and time filter. You can click on bars, or click-and-drag, to narrow the time filter
Log View: Bottom-right. Use this to look at individual log messages, and display log data filtered by fields. If no fields are selected, entire log messages are displayed
This animation demonstrates a few of the main features of the Discover page:
Here is a step-by-step description of what is being performed:
Selected the "type" field, which limits what is displayed for each log record (bottom-right)—by default, the entire log message is displayed
Searched for type: "nginx-access", which only matches Nginx access logs
Expanded the most recent Nginx access log to look at it in more detail
Note that the results are limited to the "Last 15 minutes". If you are not getting any results, be sure that logs matching your search query were generated in the specified time period.
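Under the hood, a relative time filter like "Last 15 minutes" amounts to a range query on the log timestamp. As a rough sketch (not Kibana's exact request — the `@timestamp` field name is the Logstash default, and Kibana 4 wraps its queries in additional structure), the equivalent Elasticsearch body could be built like this:

```python
import json


def last_n_minutes_query(minutes=15):
    """Build an Elasticsearch request body roughly equivalent to
    Kibana's 'Last 15 minutes' time filter: a range query on the
    @timestamp field using date math relative to now."""
    return {
        "query": {
            "range": {
                "@timestamp": {  # Logstash's default timestamp field
                    "gte": "now-{}m".format(minutes),
                    "lte": "now",
                }
            }
        }
    }


print(json.dumps(last_n_minutes_query(), indent=2))
```

Widening the time filter simply widens the `gte` bound, which is why dragging on the date histogram immediately changes which logs appear.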
The log messages that are gathered and filtered are dependent on your Logstash and Logstash Forwarder configurations. In our example, we are gathering the syslog and Nginx access logs, and filtering them by "type". If you are gathering log messages but not filtering the data into distinct fields, querying against them will be more difficult as you will be unable to query specific fields.
The search provides an easy and powerful way to select a specific subset of log messages. The search syntax is pretty self-explanatory, and allows boolean operators, wildcards, and field filtering. For example, if you want to find Nginx access logs that were generated by Google Chrome users, you can search for type: "nginx-access" AND agent: "chrome". You could also search by specific hosts or client IP address ranges, or any other data that is contained in your logs.
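The search bar accepts Lucene query string syntax, which Elasticsearch exposes through its query_string query. As a minimal sketch of the kind of request body a search like the one above turns into (the exact wrapping Kibana 4 adds around it may differ):

```python
import json


def kibana_search(query_string):
    """Wrap a Kibana search-bar query in an Elasticsearch
    query_string query, which understands Lucene syntax:
    field filters, AND/OR/NOT operators, and wildcards."""
    return {"query": {"query_string": {"query": query_string}}}


# Find Nginx access logs generated by Google Chrome users:
body = kibana_search('type: "nginx-access" AND agent: "chrome"')
print(json.dumps(body, indent=2))
```

Because the whole search string is passed through as-is, anything valid in Lucene syntax — wildcards like `agent: "chrom*"` or ranges like `response: [400 TO 499]` — works the same way.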
When you have created a search query that you want to keep, you can save it by clicking the Save Search icon and then the Save button, as in this animation:
Saved searches can be opened at any time by clicking the Load Saved Search icon, and they can also be used when creating visualizations.
We will save the type: "nginx-access" search as "type nginx access", and use it to create a visualization.
Kibana Visualize
The Kibana Visualize page is where you can create, modify, and view your own custom visualizations. There are several different types of visualizations, ranging from Vertical bar and Pie charts to Tile maps (for displaying data on a map) and Data tables. Visualizations can also be shared with other users who have access to your Kibana instance.
If this is your first time using Kibana visualizations, you must reload your field list before proceeding. Instructions to do this are covered in the Reload Field Data subsection, under the Kibana Settings section of this tutorial.
Create Vertical Bar Chart
To create a visualization, first, click the Visualize menu item.
Decide which type of visualization you want, and select it. We will create a Vertical bar chart, which is a good starting point.
Now you must select a search source. You may either create a new search or use a saved search. We will go with the latter method, and select the type nginx access search that we created earlier.
At first, the preview graph, on the right side, will be a solid bar (assuming that your search found log messages) because it consists only of a Y-axis of "Count". That is, it is simply displaying the number of logs that were found with the specified search query.
To make the visualization more useful, let's add some new buckets to it.
First, add an X-axis bucket, then click the Aggregation drop-down menu and select "Date Histogram". If you click the Apply button, the single bar will split into several bars along the X-axis. Now the Count is displayed as multiple bars, divided into intervals of time (which can be modified by selecting an interval from the drop-down)—similar to what you would see on the Discover page.
If we want to make the graph a little more interesting, we can click the Add Sub Aggregation button. Select the Split Bars bucket type. Click the Sub Aggregation drop-down menu and select "Significant Terms", then click the Field drop-down menu and select "clientip.raw", then click the Size field and enter "10". Click the Apply button to create the new graph.
Here is a screenshot of what you should see at this point:
If the logs being visualized were generated by multiple IP addresses (i.e. more than one person is accessing your site), you will see that each bar is divided into colored segments. Each colored segment represents the Count of logs generated by a specific IP address (i.e. a particular visitor to your site), and the graph will show up to 10 different segments (because of the Size setting). You can mouse over and click any of the items in the graph to drill down to specific log messages.
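The X-axis bucket and Split Bars sub-aggregation configured above correspond roughly to a nested Elasticsearch aggregation: a date_histogram with a significant_terms aggregation inside each time bucket. A sketch of that request body, assuming the Logstash default `@timestamp` field (the interval value is illustrative — Kibana normally picks one automatically):

```python
def bar_chart_aggregation(interval="1h", size=10):
    """Sketch of the aggregation behind the vertical bar chart:
    count logs per time interval (X-axis), then split each bar
    by the most significant client IPs within that interval."""
    return {
        "size": 0,  # we only want aggregation results, not the hits
        "aggs": {
            "per_interval": {
                "date_histogram": {
                    "field": "@timestamp",
                    "interval": interval,
                },
                "aggs": {
                    "top_client_ips": {
                        "significant_terms": {
                            "field": "clientip.raw",  # unanalyzed copy of the field
                            "size": size,
                        }
                    }
                },
            }
        },
    }
```

The `.raw` field is used because it is the unanalyzed copy of `clientip`; aggregating on an analyzed field would split values into tokens and produce misleading buckets.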
When you are ready to save your visualization, click the Save Visualization icon, near the top, then name it and click the Save button.
Create Another Visualization
Before continuing to the next section, where we will demonstrate how to create a dashboard, you should create at least one more visualization. Try exploring the various visualization types.
For example, you could create a pie chart of your top 5 (highest count) log "types". To do this, click Visualize, then select Pie chart. Then use a new search, leaving the search query blank so that it matches all of your logs. Then select the Split Slices bucket type. Click the Aggregation drop-down and select "Significant Terms", click the Field drop-down and select "type.raw", then click the Size field and enter "5". Now click the Apply button and save the visualization as "Top 5".
Here is a screenshot of the settings that were just described:
Because, in our example, we're only collecting syslogs and Nginx access logs, there will only be two slices in the pie chart.
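The pie chart's Split Slices bucket maps to a single top-level aggregation rather than the nested one used for the bar chart. A sketch of the equivalent request body (again an approximation of what Kibana sends, not its literal output):

```python
def pie_chart_aggregation(size=5):
    """Sketch of the Split Slices bucket: the most significant
    log types across all matched logs, one bucket per slice."""
    return {
        "size": 0,  # only aggregation results are needed
        "aggs": {
            "top_types": {
                "significant_terms": {
                    "field": "type.raw",  # unanalyzed "type" field
                    "size": size,
                }
            }
        },
    }
```

With only two distinct values of `type` in the index ("syslog" and "nginx-access"), only two buckets come back — which is why the pie chart shows just two slices despite the Size of 5.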
Once you are done creating visualizations, let's move on to creating a Kibana dashboard.
Kibana Dashboard
The Kibana Dashboard page is where you can create, modify, and view your own custom dashboards. With a dashboard, you can combine multiple visualizations onto a single page, then filter them by providing a search query or by selecting filters by clicking elements in the visualization. Dashboards are useful for when you want to get an overview of your logs, and make correlations among various visualizations and logs.
Create Dashboard
To create a Kibana dashboard, first, click the Dashboard menu item.
If you haven't created a dashboard before, you will see a mostly blank page that says "Ready to get started?". If you don't see this screen (i.e. there are already visualizations on the dashboard), press the New Dashboard icon (to the right of the search bar) to get there.
This animation demonstrates how you can add visualizations to your dashboard:
Here is a breakdown of the steps that are being performed:
Clicked Add Visualization icon
Added "Log Counts" pie chart and "Nginx: Top 10 client IP" histogram
Collapsed the Add Visualization menu
Rearranged and resized the visualizations on the dashboard
Clicked Save Dashboard icon
Choose a name for your dashboard before saving it.
This should give you a good idea of how to create a dashboard. Go ahead and create any dashboards that you think you might want. We'll cover using dashboards next.
Use Dashboard
Dashboards can be filtered further by entering a search query, changing the time filter, or clicking on the elements within the visualization.
For example, if you click on a particular color segment in the histogram, Kibana will allow you to filter on the significant term that the segment represents. Here is an example screenshot of applying a filter to a dashboard:
Be sure to click the Apply Now button to filter the results and redraw the dashboard's visualizations. Filters can be applied and removed as needed.
The search and time filters work just like they do in the Discover page, except they are only applied to the data subsets that are presented in the dashboard.
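Conceptually, clicking a segment stacks a term filter on top of the dashboard's search query, and every visualization is re-run against the combined query. A sketch of that combination as an Elasticsearch bool query (field name and structure assumed, as in the earlier examples):

```python
def filtered_dashboard_query(query_string, clientip):
    """Combine the dashboard's search-bar query with a
    click-applied term filter, the way Kibana narrows every
    visualization on the dashboard at once."""
    return {
        "query": {
            "bool": {
                # the free-text query from the search bar
                "must": {"query_string": {"query": query_string}},
                # the exact-match filter added by clicking a segment
                "filter": {"term": {"clientip.raw": clientip}},
            }
        }
    }


body = filtered_dashboard_query('type: "nginx-access"', "203.0.113.9")
```

Removing the filter simply drops the `filter` clause, which is why toggling filters on and off redraws the dashboard without changing the underlying saved search.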
Kibana Settings
The Kibana Settings page lets you change a variety of things like default values or index patterns. In this tutorial, we will keep it simple and focus on the Indices and Objects sections.
Reload Field Data
When you add new fields to your Logstash data, e.g. if you add a filter for a new log type, you may need to reload your field list. It is necessary to reload the field list if you are unable to find filtered fields in Kibana, as this data is cached only periodically.
To do so, click the Settings menu item, then click "logstash-*" (under Index Patterns):
Then click the yellow Reload Field List button. Hit the OK button to confirm.
Edit Saved Objects
The Objects section allows you to edit, view, and delete any of your saved dashboards, searches, and visualizations.
To get there, click on the Settings menu item, then the Objects sub-menu.
Here, you can select from the tabs to find the objects that you want to edit, view, or delete:
In the screenshot, we have selected a duplicate visualization. It can be edited, viewed, or deleted by clicking on the appropriate button.
Conclusion
If you followed this tutorial, you should have a good understanding of how to use Kibana 4. You should know how to search your log messages, and create visualizations and dashboards.
Be sure to check out the next tutorial in this series.
If you have any questions or suggestions, please leave a comment!