{"id":7053,"date":"2016-05-04T00:08:56","date_gmt":"2016-05-03T22:08:56","guid":{"rendered":"https:\/\/blog.redbaronofazure.com\/?p=7053"},"modified":"2016-05-28T12:39:57","modified_gmt":"2016-05-28T10:39:57","slug":"automating-provisioning-for-streamanalytics","status":"publish","type":"post","link":"https:\/\/blog.redbaronofazure.com\/?p=7053","title":{"rendered":"Automating provisioning for StreamAnalytics"},"content":{"rendered":"<p>StreamAnalytics is a key component if you plan to build an analytics or IoT solution in Azure. It is a really powerful resource once you get it configured. However, automating its deployment is a topic that isn&#8217;t covered a lot: there are currently zero templates in the Azure quick start github repo and only a handful of powershell examples out there. Since I spin up demo environments with Event Hub and StreamAnalytics quite often, I was asked to write about how I do it.<\/p>\n<p><strong>Event Hub Provisioning<\/strong><\/p>\n<p>StreamAnalytics usually works with the Event Hub, and there is an excellent article by Paolo Salvatori, listed in the references, that shows you how to do it. If you read the comments in his article, you&#8217;ll see how you can also add Authorization rules. In the script I provide, I&#8217;ve made that modification. With that script, you can create an Event Hub resource to use with StreamAnalytics in no time.<\/p>\n<p><strong>The format of StreamAnalytics provisioning<\/strong><\/p>\n<p>Working with the powershell cmdlets for StreamAnalytics takes a while to get used to, because the Get-AzureRMStreamAnalyticsJob and New-AzureRMStreamAnalyticsJob cmdlets basically work with a json template, but it&#8217;s not really the JSON Resource Template format we know from ARM. Instead, it is a custom format for StreamAnalytics. In an ideal world we could have exported the StreamAnalytics json config to a file and then imported it when we need it, but it isn&#8217;t that simple. 
The JSON that is exported is stripped of sensitive information, like keys and passwords, and you need to provide those values and update the json file before submitting it when you import the file. That&#8217;s not a simple task if you are editing a JSON file with several hundred lines of code by hand. To solve this, I wrote a script that handles this task.<\/p>\n<p>The objectives of my provisioning script are 1) to look up the keys, etc, that are missing in the json config file and 2) to make it possible to set new datasources so you can move a StreamAnalytics solution between environments, like from Dev to Test to Prod.<\/p>\n<p>What I don&#8217;t try to do is change the names of SQL tables, Storage Containers, etc, since those probably are part of the solution (you don&#8217;t change a SQL table name between test and prod).<\/p>\n<p><strong>Exporting a StreamAnalytics config<\/strong><\/p>\n<p>This is really as easy as 1-2-3. All you need to do is invoke Get-AzureRMStreamAnalyticsJob and pipe the output to a file. The PropertiesInJson property comes in handy as it holds exactly the value of the config we can import later.<\/p>\n<p><a href=\"https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-1A-export.png\"><img loading=\"lazy\" class=\"alignnone size-full wp-image-7073\" src=\"https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-1A-export.png\" alt=\"StreamA-1A-export\" width=\"958\" height=\"559\" srcset=\"https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-1A-export.png 958w, https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-1A-export-300x175.png 300w\" sizes=\"(max-width: 958px) 100vw, 958px\" \/><\/a><\/p>\n<p>However, I did take it one step further and generate a &#8220;datasources&#8221; JSON file too, much like the separate Parameters file you can have in ARM templates. 
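<\/p>
<p>The export step can be sketched roughly like this (a minimal sketch, assuming you are logged in with the AzureRM module and that the returned job object exposes the PropertiesInJson property mentioned above; resource group and job names are examples):<\/p>

```powershell
# Fetch the job and dump its full definition (the custom StreamAnalytics
# JSON format) to a file we can edit and re-import later.
$job = Get-AzureRmStreamAnalyticsJob -ResourceGroupName 'rgrp-analytics' -Name 'MyStreamJob'
$job.PropertiesInJson | Out-File 'MyStreamJob.json' -Encoding utf8
```

<p>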
The idea is that the &#8220;datasources&#8221; JSON file is the one you should change when moving between environments.<\/p>\n<p><a href=\"https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-1B-export.png\"><img loading=\"lazy\" class=\"alignnone size-full wp-image-7071\" src=\"https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-1B-export.png\" alt=\"StreamA-1B-export\" width=\"1197\" height=\"132\" srcset=\"https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-1B-export.png 1197w, https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-1B-export-300x33.png 300w, https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-1B-export-1024x113.png 1024w\" sizes=\"(max-width: 1197px) 100vw, 1197px\" \/><\/a><\/p>\n<p>The name of the file is the JobName with &#8220;-datasources&#8221; appended to it.\u00a0The format of this file is something I made up myself. It is possible to merge multiple Jobs datasource config into one single JSON file if you like to have it all in one place. 
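<\/p>
<p>Since the format of this file is my own invention, here is a hypothetical illustration of what a &#8220;-datasources&#8221; file could look like (every name, namespace and server below is made up; the authoritative format is whatever the script itself exports):<\/p>

```json
{
  "datasources": [
    {
      "name": "input-eh",
      "type": "Microsoft.ServiceBus\/EventHub",
      "serviceBusNamespace": "contoso-ns",
      "eventHubName": "telemetry",
      "sharedAccessPolicyName": "RootManageSharedAccessKey"
    },
    {
      "name": "output-sql",
      "type": "Microsoft.Sql\/Server\/Database",
      "server": "contoso-dbsrv",
      "database": "telemetrydb",
      "user": "dbadmin"
    }
  ]
}
```

<p>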
The powershell script loads this file during import\/create and matches the datasources by name, so if you want to change a namespace or a database server for your StreamAnalytics job, you can easily do it by editing this file while leaving the real config JSON file untouched.<\/p>\n<p><a href=\"https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-1C-export.png\"><img loading=\"lazy\" class=\"alignnone size-full wp-image-7072\" src=\"https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-1C-export.png\" alt=\"StreamA-1C-export\" width=\"935\" height=\"507\" srcset=\"https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-1C-export.png 935w, https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-1C-export-300x163.png 300w\" sizes=\"(max-width: 935px) 100vw, 935px\" \/><\/a><\/p>\n<p><strong>The format of the config JSON file<\/strong><\/p>\n<p>The format of the JSON file that is exported and contains the config is publicly documented in MSDN (see refs below). It basically has four major sections for a Job, which are<\/p>\n<p><a href=\"https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-2-arch.png\"><img loading=\"lazy\" class=\"alignnone size-full wp-image-7056\" src=\"https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-2-arch.png\" alt=\"StreamA-2-arch\" width=\"603\" height=\"488\" srcset=\"https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-2-arch.png 603w, https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-2-arch-300x243.png 300w\" sizes=\"(max-width: 603px) 100vw, 603px\" \/><\/a><\/p>\n<ul>\n<li><strong>Job<\/strong> &#8211; the actual StreamAnalytics resource you create is called a Job. 
The Job is a container for everything else and just holds the name and the location (Azure datacenter) where the job resource exists.<\/li>\n<li><strong>Inputs<\/strong> &#8211; A job has 1..N inputs and each input has a datasource and serialization configuration. Datasource config is account names, keys, SQL Server and database names, etc. Serialization is about what data format the datasource works with, like CSV or JSON, and if it&#8217;s UTF8, etc.<\/li>\n<li><strong>Outputs<\/strong> &#8211; A job also has 1..N outputs, which are configured exactly like inputs, with datasource and serialization configurations. A job can have a different number of inputs and outputs, and a common scenario is to have 1 input and multiple outputs to split the data streaming through.<\/li>\n<li><strong>Transformation<\/strong> &#8211; Data streaming through is processed via a transformation step that looks pretty much like a SQL statement with a SELECT * INTO outputs FROM inputs syntax. There is one and only one transformation step, so if you have multiple inputs they need to be joined in the SELECT step. This is also the place where you can invoke functions that exist elsewhere, like in Azure Machine Learning.<\/li>\n<\/ul>\n<p>The powershell script I have developed in this blog post is mainly focused on updating the DataSource configuration in the Inputs and Outputs sections, since that is what varies between environments. That is, it tries to connect the arrows in the figure above.<\/p>\n<p><strong>Powershell script<\/strong><\/p>\n<p>The Powershell script can do multiple operations, like export the JSON file, start\/stop the StreamAnalytics Job, and also create (import) or delete the job. Everything in the code is pretty simple except the create\/import, which requires some logic. 
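<\/p>
<p>The top-level structure of such a script can be sketched as a switch on the operation parameter (a simplified sketch, not the exact script; the parameter names and the Export-Job\/Create-Job helper functions are placeholders, while the Start\/Stop\/Remove cmdlets are the real AzureRM ones):<\/p>

```powershell
param( [string]$Operation, [string]$JobName, [string]$ResourceGroup )

switch ( $Operation.ToLower() ) {
  'export' { Export-Job $ResourceGroup $JobName }   # placeholder: dump job JSON + datasources file
  'start'  { Start-AzureRmStreamAnalyticsJob  -ResourceGroupName $ResourceGroup -Name $JobName }
  'stop'   { Stop-AzureRmStreamAnalyticsJob   -ResourceGroupName $ResourceGroup -Name $JobName }
  'delete' { Remove-AzureRmStreamAnalyticsJob -ResourceGroupName $ResourceGroup -Name $JobName }
  'create' { Create-Job $ResourceGroup $JobName }   # placeholder: fix up datasources, then submit
}
```

<p>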
The Create\/Import logic is focused on fixing up the inputs\/outputs DataSources. To do that, we load the datasources JSON file and pass it to a function that goes through each datasource in the inputs\/outputs of the job and updates the JSON config.<\/p>\n<p><a href=\"https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-3A-main-logic.png\"><img loading=\"lazy\" class=\"alignnone size-full wp-image-7074\" src=\"https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-3A-main-logic.png\" alt=\"StreamA-3A-main-logic\" width=\"1168\" height=\"1039\" srcset=\"https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-3A-main-logic.png 1168w, https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-3A-main-logic-300x267.png 300w, https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-3A-main-logic-1024x911.png 1024w\" sizes=\"(max-width: 1168px) 100vw, 1168px\" \/><\/a><\/p>\n<p>Once the JSON values are updated for the DataSources, the script saves them to a temp file and invokes the New-AzureRmStreamAnalyticsJob cmdlet to create\/alter the resource in Azure. That sounds easy-peasy, but as you will see, supporting multiple datasources will make the script grow in lines of code.<\/p>\n<p><strong>Updating DataSources<\/strong><\/p>\n<p>The JSON config contains a Type attribute for each DataSource, which is in the same format as in a Resource Template, so the job here is to handle each different datasource separately. 
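<\/p>
<p>The per-type dispatch can be sketched like this (the Update-* function names are hypothetical placeholders; the type strings are the datasource types used in the documented StreamAnalytics JSON format):<\/p>

```powershell
# $section is one input/output section from the exported job JSON,
# $dsrc is the matching entry from the -datasources file (matched by name)
switch ( $section.properties.datasource.type ) {
  'Microsoft.ServiceBus/EventHub'  { Update-EventHubDataSource $section $dsrc }
  'Microsoft.Storage/Blob'         { Update-BlobDataSource     $section $dsrc }
  'Microsoft.Sql/Server/Database'  { Update-SqlDataSource      $section $dsrc }
}
```

<p>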
The code starts with a lookup of the datasources by name from the loaded JSON file containing the datasources configuration.<\/p>\n<p><a href=\"https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-4A-datasource-logic.png\"><img loading=\"lazy\" class=\"alignnone size-full wp-image-7075\" src=\"https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-4A-datasource-logic.png\" alt=\"StreamA-4A-datasource-logic\" width=\"981\" height=\"370\" srcset=\"https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-4A-datasource-logic.png 981w, https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-4A-datasource-logic-300x113.png 300w\" sizes=\"(max-width: 981px) 100vw, 981px\" \/><\/a><\/p>\n<p>Stream Analytics currently supports input from Event Hub, Storage and IoT Hub, and of those I skipped IoT Hub for now. Output supports a lot more, and here I skipped Power BI, DocumentDB and a few more and just stuck to the basic ones.<\/p>\n<p>Since each DataSource is unique, it is handled by a different function. 
The Event Hub method is designed to set the namespace and EventHub Name and to retrieve the SharedAccessPolicyKey, which is the item not exported by Azure and is also the key item that your IoT device, like a Raspberry Pi, needs to be able to ingest data into the event hub.<\/p>\n<p><a href=\"https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-5A1-datasource-eventhub.png\"><img loading=\"lazy\" class=\"alignnone size-full wp-image-7076\" src=\"https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-5A1-datasource-eventhub.png\" alt=\"StreamA-5A1-datasource-eventhub\" width=\"1063\" height=\"234\" srcset=\"https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-5A1-datasource-eventhub.png 1063w, https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-5A1-datasource-eventhub-300x66.png 300w, https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-5A1-datasource-eventhub-1024x225.png 1024w\" sizes=\"(max-width: 1063px) 100vw, 1063px\" \/><\/a><\/p>\n<p>The process of retrieving the SharedAccessPolicyKey is about using the existing APIs for Service Bus and getting the value from Azure. The code checks a dictionary variable to see if we already have this info and uses the Get-AzureSBAuthorizationRule cmdlet to get the key if we don&#8217;t. 
(The awkward powershell line below is a one-liner that finds the right rule in an array and &#8220;digs out&#8221; the key.)<\/p>\n<p><a href=\"https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-5B-datasource-eventhub.png\"><img loading=\"lazy\" class=\"alignnone size-full wp-image-7062\" src=\"https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-5B-datasource-eventhub.png\" alt=\"StreamA-5B-datasource-eventhub\" width=\"1623\" height=\"329\" srcset=\"https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-5B-datasource-eventhub.png 1623w, https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-5B-datasource-eventhub-300x61.png 300w, https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-5B-datasource-eventhub-1024x208.png 1024w\" sizes=\"(max-width: 1623px) 100vw, 1623px\" \/><\/a><\/p>\n<p>Updating the DataSource for a Service Bus Queue or Topic is a little more complicated, since you can have a SAS rule on the namespace level or on the queue\/topic level, so this function needs to do more lookups. I also made it possible to change queue\/topic names when switching environments, which is why there are a lot of if&#8217;s in the beginning.<\/p>\n<p>If the SAS key isn&#8217;t found on the namespace level, we need to go down to the Queue\/Topic level. Here we are leaving the yellow brick road of native Powershell cmdlet support and need to use CLR interop with Microsoft.ServiceBus.dll to use the NamespaceManager. 
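<\/p>
<p>The interop part can be sketched roughly like this (a sketch under the assumption that Microsoft.ServiceBus.dll is available locally, e.g. from the Service Bus NuGet package; the path, queue and policy names are examples):<\/p>

```powershell
# Load the Service Bus client library and use NamespaceManager to read
# the SAS rule directly off the queue entity.
Add-Type -Path 'C:/Tools/Microsoft.ServiceBus.dll'
$nsMgr = [Microsoft.ServiceBus.NamespaceManager]::CreateFromConnectionString($connStr)
$queue = $nsMgr.GetQueue('myqueue')
$rule  = $null
if ( $queue.Authorization.TryGetSharedAccessAuthorizationRule('mypolicy', [ref]$rule) ) {
    $sasKey = $rule.PrimaryKey   # the key the job config needs
}
```

<p>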
This means rewriting this for Azure CLI on Mac and\/or Linux will be a pain.<\/p>\n<p><a href=\"https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-6A-datasource-queue.png\"><img loading=\"lazy\" class=\"alignnone size-full wp-image-7077\" src=\"https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-6A-datasource-queue.png\" alt=\"StreamA-6A-datasource-queue\" width=\"1212\" height=\"976\" srcset=\"https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-6A-datasource-queue.png 1212w, https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-6A-datasource-queue-300x242.png 300w, https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-6A-datasource-queue-1024x825.png 1024w\" sizes=\"(max-width: 1212px) 100vw, 1212px\" \/><\/a><\/p>\n<p>Handling a SQL Azure database as a datasource involves a different kind of complexity. You can&#8217;t query Azure in any way to get the password of the userid configured in the portal. 
So the script can&#8217;t look it up, which means it either needs to be in the JSON datasources file or we need to prompt for it.<\/p>\n<p><a href=\"https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-7A-datasource-db.png\"><img loading=\"lazy\" class=\"alignnone size-full wp-image-7078\" src=\"https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-7A-datasource-db.png\" alt=\"StreamA-7A-datasource-db\" width=\"980\" height=\"572\" srcset=\"https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-7A-datasource-db.png 980w, https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-7A-datasource-db-300x175.png 300w\" sizes=\"(max-width: 980px) 100vw, 980px\" \/><\/a><\/p>\n<p><strong>Running the script<\/strong><\/p>\n<p>Running the script to create\/import the job assumes you have run it once before to export the job as a JSON file (which you&#8217;ll need in the create\/import phase). This means you design and develop the Stream Analytics Job in the portal and then use my script to export it to a JSON file. 
Once you have that, you can recreate the Stream Analytics job as easily as below.<\/p>\n<p><a href=\"https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-8A-run-script.png\"><img loading=\"lazy\" class=\"alignnone size-full wp-image-7079\" src=\"https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-8A-run-script.png\" alt=\"StreamA-8A-run-script\" width=\"1230\" height=\"233\" srcset=\"https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-8A-run-script.png 1230w, https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-8A-run-script-300x57.png 300w, https:\/\/blog.redbaronofazure.com\/wp-content\/uploads\/2016\/05\/StreamA-8A-run-script-1024x194.png 1024w\" sizes=\"(max-width: 1230px) 100vw, 1230px\" \/><\/a><\/p>\n<p><strong>Summary<\/strong><\/p>\n<p>Stream Analytics is powerful and an integral part of Azure Analytics (and what is now being rebranded as Cortana Intelligence or the Intelligent Cloud). You will pick up how to use it in the portal pretty easily, but automating its provisioning is a different story. Historically, Service Bus and Stream Analytics haven&#8217;t been best in class when it comes to Powershell and tooling support. It has become better, and in this blog post I&#8217;ve shown you something you can use as a DevOps guy.<\/p>\n<p>In coming posts I&#8217;ll tie it all together and show you a simulated IoT agent you can run on your Raspberry Pi (or an Ubuntu Azure VM if you don&#8217;t have a Raspberry Pi) and also how to integrate Machine Learning. 
When you realize that you can provision all of this in just minutes, you will start to see the power of the Public Cloud, such as Azure.<\/p>\n<p><strong>References<\/strong><\/p>\n<p>Creating an Event Hub namespace using PowerShell &#8211; Paolo Salvatori<br \/>\n<a href=\"https:\/\/blogs.msdn.microsoft.com\/paolos\/2014\/12\/01\/how-to-create-a-service-bus-namespace-and-an-event-hub-using-a-powershell-script\/\" target=\"_blank\">https:\/\/blogs.msdn.microsoft.com\/paolos\/2014\/12\/01\/how-to-create-a-service-bus-namespace-and-an-event-hub-using-a-powershell-script\/<\/a><\/p>\n<p>MSDN Documentation of the StreamAnalytics JSON file<br \/>\n<a href=\"https:\/\/msdn.microsoft.com\/library\/dn834994.aspx\" target=\"_blank\">https:\/\/msdn.microsoft.com\/library\/dn834994.aspx<\/a><\/p>\n<p>MSDN Documentation of StreamAnalytics Powershell cmdlets<br \/>\n<a href=\"https:\/\/msdn.microsoft.com\/en-us\/library\/mt603479.aspx\" target=\"_blank\">https:\/\/msdn.microsoft.com\/en-us\/library\/mt603479.aspx<\/a><\/p>\n<p>Azure Documentation (that makes a level 100 effort explaining this)<br \/>\n<a href=\"https:\/\/azure.microsoft.com\/en-us\/documentation\/articles\/stream-analytics-monitor-and-manage-jobs-use-powershell\/\" target=\"_blank\">https:\/\/azure.microsoft.com\/en-us\/documentation\/articles\/stream-analytics-monitor-and-manage-jobs-use-powershell\/<\/a><\/p>\n<p>StreamAnalytics &#8211; Getting Started documentation<br \/>\n<a href=\"https:\/\/azure.microsoft.com\/en-us\/documentation\/articles\/stream-analytics-get-started\/\" target=\"_blank\">https:\/\/azure.microsoft.com\/en-us\/documentation\/articles\/stream-analytics-get-started\/<\/a><\/p>\n<p><strong>Source Code<\/strong><\/p>\n<p>The Powershell script can be downloaded here<br \/>\n<a href=\"https:\/\/github.com\/cljung\/az-streaming-analytics\">https:\/\/github.com\/cljung\/az-streaming-analytics<\/a><\/p>\n<p>How to use it:<\/p>\n<ol>\n<li>Design your StreamAnalytics job in the Azure 
portal<\/li>\n<li>Export the config via<br \/>\n.\\Deploy-StreamingAnalytics.ps1 export -JobName &lt;my-name&gt;<\/li>\n<li>Delete the StreamAnalytics job via<br \/>\n.\\Deploy-StreamingAnalytics.ps1 stop -JobName &lt;my-name&gt;<br \/>\n.\\Deploy-StreamingAnalytics.ps1 delete -JobName &lt;my-name&gt;<\/li>\n<li>Possibly edit the &lt;my-name&gt;-datasources.json file<\/li>\n<li>(re)Create the StreamAnalytics job via<br \/>\n.\\Deploy-StreamingAnalytics.ps1 create -JobName &lt;my-name&gt;<\/li>\n<\/ol>\n","protected":false},"excerpt":{"rendered":"<p>StreamAnalytics is a key component if you plan to build an analytics or IoT solution in Azure. It is a really powerful resource once you get it configured. However, automating its deployment is a topic that isn&#8217;t covered a lot and there are currently zero templates in the Azure quick start github repo and there are only [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":[],"categories":[395,101],"tags":[396,71,397],"_links":{"self":[{"href":"https:\/\/blog.redbaronofazure.com\/index.php?rest_route=\/wp\/v2\/posts\/7053"}],"collection":[{"href":"https:\/\/blog.redbaronofazure.com\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/blog.redbaronofazure.com\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/blog.redbaronofazure.com\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/blog.redbaronofazure.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=7053"}],"version-history":[{"count":11,"href":"https:\/\/blog.redbaronofazure.com\/index.php?rest_route=\/wp\/v2\/posts\/7053\/revisions"}],"predecessor-version":[{"id":7119,"href":"https:\/\/blog.redbaronofazure.com\/index.php?rest_route=\/wp\/v2\/posts\/7053\/revisions\/7119"}],"wp:attachment":[{"href":"https:\/\/blog.redbaronofazure.com\/
index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=7053"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/blog.redbaronofazure.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=7053"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/blog.redbaronofazure.com\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=7053"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}