How to Query Extended Events Target XML


When Extended Events (XE) were released with SQL Server 2008, I was excited – something new to learn! I read Microsoft articles about them, read blog posts about them, and tried to use them. At that time, there was no GUI for XE. I had to create the sessions with T-SQL. That wasn’t awful – I learned the syntax after some trial and error, and could start and stop sessions. There was also no GUI to view the saved data – and it was all saved as XML.

I ran headfirst into a brick wall.

I gave up when faced with XML. When SQL Server 2008 R2 came out, I once again looked at XE. With no GUI for creating or viewing data, I retreated in defeat again. With SQL Server 2012 came the GUI – a new session wizard, a new session dialog box, the (not-recommended) ability to view live data, and the ability to see the information you collected. But that didn’t solve the root problem for me – all the good data, all the results I wanted to view, aggregate, and slice and dice, were still in XML.

So, I learned to query XML with T-SQL. It hasn’t been easy for me to understand. But, by breaking it down into small steps I was able to conquer it. If querying XML has been the reason you’ve been avoiding XE, I’m here to help you get past it!

Create a session

I’m going to create a session that collects wait statistics and puts them in the Ring Buffer target, then run a small workload to capture waits.
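A minimal sketch of such a session looks like this (the session name WaitStats and the duration predicate are my illustrative choices, not anything the original prescribed):

```sql
-- A minimal wait-stats session using the ring_buffer target.
-- Session name and predicate are illustrative; adjust to taste.
CREATE EVENT SESSION [WaitStats] ON SERVER
ADD EVENT sqlos.wait_info
    (WHERE duration > 0)   -- skip zero-length waits
ADD TARGET package0.ring_buffer;
GO

ALTER EVENT SESSION [WaitStats] ON SERVER STATE = START;
```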

I used the default query from Books Online to view the data. This is not helpful!
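For reference, the Books Online-style query joins the session DMVs and casts the target data to XML – something along these lines (WaitStats is the illustrative session name):

```sql
-- Returns one row per target, with all the data in a single XML column.
SELECT s.name, t.target_name, CAST(t.target_data AS xml) AS target_data
FROM sys.dm_xe_sessions s
JOIN sys.dm_xe_session_targets t
    ON s.address = t.event_session_address
WHERE s.name = 'WaitStats';
```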

[Screenshot: the default ring buffer query, returning a single XML column]

Even if I click the target_data column (which I can do!), I just get a page of XML I have to read through. Yuck!

[Screenshot: a wall of raw ring buffer XML]

Querying XML

So, how do I make this data understandable, so I can count the number of times each wait type occurred? The goal is to get the XML into a format we can query. I’m going to start by inserting the data into a temp table and selecting it.
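A sketch of that step, assuming the session is named WaitStats and the temp table is called #WaitData (both names are my own for illustration):

```sql
-- Copy the ring buffer contents into a temp table, then select it.
SELECT CAST(t.target_data AS xml) AS target_data
INTO #WaitData
FROM sys.dm_xe_sessions s
JOIN sys.dm_xe_session_targets t
    ON s.address = t.event_session_address
WHERE s.name = 'WaitStats'
  AND t.target_name = 'ring_buffer';

SELECT target_data FROM #WaitData;
```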

This gives us the same link to a page of XML. Now that the data is in a temp table, though, we can “shred” the XML. XML is just nodes of data; you only need to understand the hierarchy of the nodes to query it.

Let’s look at this XML. I can see there is one element, event name=“wait_info” (highlighted in green), that I want to dig into. Under that, I want to get the “wait_type”, “duration”, and “signal_duration” (highlighted in yellow) attributes.
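Trimmed down, the hierarchy looks roughly like this (the values are made up; note that wait_type’s readable name lives in a text node, while the durations live in value nodes):

```xml
<RingBufferTarget>
  <event name="wait_info" package="sqlos" timestamp="...">
    <data name="wait_type"><value>120</value><text>PAGEIOLATCH_SH</text></data>
    <data name="duration"><value>12</value></data>
    <data name="signal_duration"><value>0</value></data>
  </event>
  <!-- one event element per captured wait -->
</RingBufferTarget>
```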

[Screenshot: the XML, with the wait_info element and its attributes highlighted]

To do so, I use the T-SQL XML value function to pull out wait_type, like so.
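A sketch of that first step, assuming the temp table from earlier is named #WaitData: .nodes() turns each wait_info event into a row, and .value() pulls one value out of each.

```sql
SELECT
    xed.event_data.value('(data[@name="wait_type"]/text)[1]', 'varchar(60)') AS wait_type
FROM #WaitData wd
CROSS APPLY wd.target_data.nodes('RingBufferTarget/event[@name="wait_info"]') AS xed (event_data);
```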

Once I understand how to use the value function, I can do the same for other data I want to see.
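Extending the same pattern to the durations (same illustrative names; bigint is the safe choice here, since long waits can overflow an int):

```sql
SELECT
    xed.event_data.value('(data[@name="wait_type"]/text)[1]', 'varchar(60)')   AS wait_type,
    xed.event_data.value('(data[@name="duration"]/value)[1]', 'bigint')        AS duration_ms,
    xed.event_data.value('(data[@name="signal_duration"]/value)[1]', 'bigint') AS signal_duration_ms
FROM #WaitData wd
CROSS APPLY wd.target_data.nodes('RingBufferTarget/event[@name="wait_info"]') AS xed (event_data);
```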

I can read this!

Level up: Aggregating data

For this XE session (and wait statistics in general), having every wait listed out isn’t helpful. I want to aggregate the data so I can see which wait occurred the most frequently. What happens when I try to aggregate this query?

I get an error.

I have to take a different approach. I’m going to use a derived table. I will put my initial query to select the data from the temp table into the FROM clause, then aggregate on the results.
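Sketched out with the same illustrative names, the shredding query moves into the FROM clause and the outer query groups on plain columns:

```sql
SELECT
    waits.wait_type,
    COUNT(*)               AS wait_count,
    SUM(waits.duration_ms) AS total_duration_ms
FROM
(
    SELECT
        xed.event_data.value('(data[@name="wait_type"]/text)[1]', 'varchar(60)') AS wait_type,
        xed.event_data.value('(data[@name="duration"]/value)[1]', 'bigint')      AS duration_ms
    FROM #WaitData wd
    CROSS APPLY wd.target_data.nodes('RingBufferTarget/event[@name="wait_info"]') AS xed (event_data)
) AS waits
GROUP BY waits.wait_type
ORDER BY wait_count DESC;
```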

I have usable data!

This is one sleepy system.

Last but not least, I’ll clean up after myself.
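With the illustrative names used above, cleanup is just:

```sql
-- Stop and drop the session, and drop the temp table.
ALTER EVENT SESSION [WaitStats] ON SERVER STATE = STOP;
DROP EVENT SESSION [WaitStats] ON SERVER;
DROP TABLE #WaitData;
```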

Querying the XML output from Extended Events sessions is easy if you take it step by step. When you are looking at the raw XML, establish which elements you want to extract. (These will vary by the events you choose to capture.) Move the data into a temp table you can work with. Use the XML value function to extract one element at a time, until you have the data you need. Lastly, if you need or want to, you can aggregate the results.

Don’t let XML be the reason you won’t use Extended Events – it isn’t for me any longer!

Want to learn more about Extended Events? Start here! 


22 Comments

  • The largest problem I have with Extended Events is performance. Dumping trace data into a table was a straightforward process. Even if you had multiple TRC files, SQL Profiler would stitch them together for you.

    With extended events, if you open the file natively, there is an option to export to table but it will not chain them together, forcing me to open each file manually.

    If I try to do this through SQL, if the XEL files are of any significant size and quantity, shredding the XML is a laborious process.

    Overall I like Extended Events but issues like this keep me from using them over server side traces.

    If you have found a better way of parsing multiple XEL files, please let me know! I’d love to hear how you did it.

    • Jes Schultz Borland
      January 26, 2015 8:50 am

      Are you using sys.fn_xe_file_target_read_file to read all XEL files from a directory?

      I don’t have any great answers to your question yet, but you’ve definitely given me something to research!

  • Rafael Dontal
    January 5, 2016 10:53 am

    Hi Jes,

    Do you have any advice for querying a large event file? I have a 600MB file and I’m having trouble querying it. My query ran for 2 hours and I had to stop it.

    Thanks a million.

    • Rafael – that’s not really a good use for Extended Events. The idea of Extended Events is that you gather just the information you need, and no extra. Do filtering when you’re capturing a trace so that you only get the information you need.

      • Rafael Dontal
        January 5, 2016 11:13 am

        Really honored to get a reply from you, Brent. Thanks for that.

        By the way, we are doing loads of tests with different scenarios and we would like to build a consistent benchmark of each one. I am monitoring a group of procs for which I want average duration, reads, and writes. It turned out to be impossible using XE. I would say that Profiler would have a huge impact on performance. Saving the start and finish time of each execution would affect the procs’ performance as well.

        Do you have any idea what would be the best option to get that needed information for a long period?

        • franklin yamamoto
          November 21, 2017 4:11 pm

          If you are looking at proc performance monitor them with sys.dm_exec_procedure_stats – these stats are cumulative since the proc was cached though. You might want to poll the stats via a sql job and store the stats in a table – you’ll need to take a delta between each polled interval. Use the sql_handles to only track the procs you are interested in. Your other option is to purchase an APM tool.

  • Nice overview of how cumbersome it is to scrape information out of XML. Good info for SQL 2008 die-hards and a stark warning for the rest of us. One of the better examples I’ve seen on the web.

  • SSMS Menu: File > Open > Merge Extended Events Files.
    Select several .XEL files to append together in the SSMS display.

    Then, with focus on the SSMS pane that is displaying the event file data…

    Menu: Extended Events > Export to > Table.
    – – – – –

    Maybe the above steps would address @Clayton Hoyt’s scenario?

    • Wow, opening it in SSMS is easier than wrestling with XML, at least for quick and easy diagnoses.

    • This option through SSMS saved my night!! Question now is how to make it repeatable and automatic? Dealing with the sys.fn_xe_file_target_read_file function and XML syntax like xdata.value('(/event/action[@name="database_name"]/value)[1]','varchar(100)') AS [Database] etc. works fine for small files, but for larger files it is a complete nightmare.

  • Is there a way to read xevents with Powershell?

    • I pasted your question into Google and came up with this link:

      • Thank you for the reply. However, that doesn’t answer my question. Likewise, the PS provider examples only demonstrate how to manage a package, not how to get the data out for analysis. I’d like to create a script that can be used for analysis of the data, for predicting failures and/or performance issues before they occur.

        • Oh, good luck with that. PowerShell sounds like a terrible tool for the job.

          But if you want some direction, you’re either going to be issuing XQuery to parse event data on the SQL Server where the session is active (which is just adding an extra layer of complexity on top of XQuery, YMMV), or pulling .xel file contents and using XQuery within PowerShell to do it. You’ll probably run into trouble doing that if the file is currently active in the session (SQL locks it down, like mdfs/ldfs/error log files), but you may be able to make a copy and then import.

          You may have better luck asking people who actively use PowerShell, like the SQL Community channel on Slack.

          Have a great weekend!

        • You can actually use the .NET assembly to read XE in PowerShell. Hit me up on Twitter if you want an example @m82labs. Or email me, my first name at

  • Jan Lundström
    March 13, 2019 3:37 am

    We’d like to use XE for security audit collection: log any SQL queries to the databases and export them to a tamper-proof log review system. The problem is that none of our systems (except SSMS) can read the files. How can you, in an automated way, parse the data, convert it to a common file format (csv, xml, xls), and then truncate the xel files?

  • wait_type_duration_ms is a BIGINT and that might cause a problem ( xed.event_data.value('(data[@name="duration"]/value)[1]', 'int') )

  • How can I change the location of file for extended event ?
    By default it reside at c:\ drive.

  • Mohammad Abdul Majeed
    November 26, 2019 10:47 pm

    Hi Ozar,

    Is the duration in milliseconds or microseconds for blocking in Extended Events?

    blocked_process – Test
    database_id 13
    database_name Test
    duration 888747000
    index_id 256
    lock_mode X
    object_id 1102
    resource_owner_type LOCK
    transaction_id 621922007

