Question
Upon a trigger, generate 3 XML files and once complete, ftp them to a site.
I have an HTTP Trigger Azure function that when run will construct 3 XML files and save these to an Azure Storage Blob container. Because of the multiple outputs, and the need to control the output path/filenames, I use the imperative binding approach and use the IBinder outputBinder in my function. This all works just fine. An example of the output path in the blob storage is export/2017-03-22/file-2017-03-22T12.03.02.54.xml. The file is in a folder with the date, and each filename has a timestamp to ensure uniqueness.
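For reference, a minimal sketch of that imperative pattern with IBinder, assuming the container and timestamp format from the example path above and a hypothetical XML payload:

using System;
using System.IO;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, IBinder outputBinder)
{
    var now = DateTime.UtcNow;
    for (var i = 1; i <= 3; i++)
    {
        // Runtime-computed path, e.g. export/2017-03-22/file-2017-03-22T12.03.02.54.xml
        var path = $"export/{now:yyyy-MM-dd}/file{i}-{now:yyyy-MM-ddTHH.mm.ss.ff}.xml";
        using (var writer = await outputBinder.BindAsync<TextWriter>(
            new BlobAttribute(path, FileAccess.Write)))
        {
            await writer.WriteAsync("<root/>"); // hypothetical XML content
        }
    }
    return req.CreateResponse(HttpStatusCode.OK);
}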
When all 3 files are generated, I want to trigger another function that will sFTP these to a site. Now I initially thought that I should use a blob trigger, but I couldn't figure out how to trigger on inputs whose filenames and paths were dynamic. I couldn't find such an example in the blob trigger documentation.
So then I thought I could have my HTTP Trigger output to a declarative binding and also output the XML files into an outgoing container in my blob storage which my blob trigger could be watching. This also works; however, because my function is on the consumption plan, there can be up to a 10-minute delay in processing new blobs.
So the documented alternative is to use a queue trigger. I can output to my queue and have the queue trigger fire just fine, but how do I also pass the 3 XML streams to my QueueTrigger function?
I suppose as a fallback, I can post an object that contains the Azure Storage paths of the constructed XMLs and then use the Storage SDK to fetch the streams and use those to post to the FTP, but would it be more efficient to also pass those Storage Blob streams as inputs to my QueueTrigger?
Answer
I think your approach with Queue Trigger makes sense. I would construct a message like this
public class QueueItem
{
    // Property names must match the {tokens} used in the blob binding paths below.
    public string FirstBlobPath { get; set; }
    public string SecondBlobPath { get; set; }
    public string ThirdBlobPath { get; set; }
}
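On the producer side, the HTTP-triggered function can drop this message onto the queue once the three blobs are written. A hypothetical sketch using a queue output binding (the collector name and blob paths here are assumptions, not from the original answer):

using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;

// Assumes a declarative queue output binding on "myqueue" named "outgoingQueue".
public static async Task Run(HttpRequestMessage req, IBinder outputBinder,
                             IAsyncCollector<QueueItem> outgoingQueue)
{
    // ... write the three XML blobs via outputBinder as before ...
    // Paths are relative to the container named in the blob bindings below.
    await outgoingQueue.AddAsync(new QueueItem
    {
        FirstBlobPath  = "2017-03-22/file1-2017-03-22T12.03.02.54.xml",
        SecondBlobPath = "2017-03-22/file2-2017-03-22T12.03.02.54.xml",
        ThirdBlobPath  = "2017-03-22/file3-2017-03-22T12.03.02.54.xml"
    });
}

The POCO is serialized to JSON automatically when it is enqueued.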
and then use declarative binding in the queue processing function, something like
{
  "bindings": [
    {
      "type": "queueTrigger",
      "name": "item",
      "direction": "in",
      "queueName": "myqueue",
      "connection": "..."
    },
    {
      "type": "blob",
      "name": "file1",
      "path": "mycontainer/{FirstBlobPath}",
      "connection": "...",
      "direction": "in"
    },
    {
      "type": "blob",
      "name": "file2",
      "path": "mycontainer/{SecondBlobPath}",
      "connection": "...",
      "direction": "in"
    },
    {
      "type": "blob",
      "name": "file3",
      "path": "mycontainer/{ThirdBlobPath}",
      "connection": "...",
      "direction": "in"
    }
  ],
  "disabled": false
}
and the function. (The {FirstBlobPath}-style tokens above are resolved from the correspondingly named properties of the JSON queue message, so the three blobs arrive as bound streams.)
public static void Run(QueueItem item, Stream file1, Stream file2, Stream file3)
{
}
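The body is where the sFTP upload would go. A minimal sketch, assuming the SSH.NET package (Renci.SshNet) as the sFTP client and placeholder host, credentials, and remote paths:

using System.IO;
using Renci.SshNet; // SSH.NET NuGet package (an assumed choice of sFTP client)

public static void Run(QueueItem item, Stream file1, Stream file2, Stream file3)
{
    // Placeholder connection details; read these from app settings in practice.
    using (var sftp = new SftpClient("sftp.example.com", 22, "user", "password"))
    {
        sftp.Connect();
        sftp.UploadFile(file1, "/incoming/" + Path.GetFileName(item.FirstBlobPath));
        sftp.UploadFile(file2, "/incoming/" + Path.GetFileName(item.SecondBlobPath));
        sftp.UploadFile(file3, "/incoming/" + Path.GetFileName(item.ThirdBlobPath));
        sftp.Disconnect();
    }
}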