
Closed
Posted on
Pays on delivery
The task centres on moving content that currently lives in Azure Blob Storage into two destinations—SharePoint Online and a third-party cloud repository that we will finalise together. The source consists of mixed-format files: Word documents, images, CSVs and a few other common types. I need a lightweight yet robust ETL solution that can:
• automatically detect new or updated blobs,
• perform any required transformations (basic metadata enrichment, naming conventions, optional compression where helpful), and
• deliver the output to the target platforms with solid error handling and retry logic.

An Azure-native approach—Azure Data Factory, Logic Apps, Functions, or a combination—will fit best, but I’m open to other suggestions so long as the pipeline runs securely inside my Azure subscription and can be scheduled or triggered on demand. Clear logging is essential; I should be able to see job status, successes and failures at a glance.

Deliverables
1. Source-controlled code / ARM or Bicep templates to deploy the pipeline.
2. A concise read-me that explains configuration steps and environment variables.
3. A short screen-share hand-off (recorded) showing the pipeline running end-to-end on my tenant.

Acceptance criteria
• All blobs placed in the test container appear in the correct SharePoint document library and the chosen third-party cloud folder with matching metadata.
• Failed transfers are logged and automatically retried up to three times.
• The solution can be re-deployed via template without manual UI steps (aside from authentication secrets).

If this scope aligns with your expertise, let’s discuss the implementation details and timelines.
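For illustration, the three-retry behaviour in the acceptance criteria could look like the minimal Python sketch below. The names (`transfer_with_retries`, `transfer`) are hypothetical placeholders, not part of any deliverable; a real pipeline would wrap the actual SharePoint/cloud upload call.

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")

def transfer_with_retries(transfer, blob_name, max_attempts=3, delay=0.0):
    """Attempt a transfer up to max_attempts times, logging each failure.

    `transfer` is any callable that raises on failure (e.g. an upload to
    SharePoint or the third-party repository).
    """
    for attempt in range(1, max_attempts + 1):
        try:
            transfer(blob_name)
            log.info("Transferred %s on attempt %d", blob_name, attempt)
            return True
        except Exception as exc:
            log.warning("Attempt %d failed for %s: %s", attempt, blob_name, exc)
            time.sleep(delay)  # back-off between attempts (0 here for brevity)
    log.error("Giving up on %s after %d attempts", blob_name, max_attempts)
    return False
```

The boolean result makes it easy to count successes and failures for the at-a-glance status view the brief asks for.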
Project ID: 40263823
42 proposals
Remote project
Active 4 days ago
Set your budget and timeframe
Get paid for your work
Outline your proposal
It's free to sign up and bid on jobs
42 freelancers are bidding on average £4,120 GBP for this job

With over 10 years of experience in web and mobile development, specializing in Azure solutions, I understand the challenge you face in moving content from Azure Blob Storage to SharePoint and another cloud repository efficiently. Your need for a lightweight yet robust ETL solution with automatic detection of new or updated blobs, transformation capabilities, and solid error handling aligns perfectly with my expertise.

I have successfully implemented similar projects in the past, particularly with Azure-native solutions. My experience creating pipelines with Azure Data Factory, Logic Apps, and Functions ensures that your data migration process will run securely within your Azure subscription and meet all your requirements. I am confident in delivering the source-controlled code/templates for deployment, a comprehensive read-me guide for configuration, and a detailed screen-share hand-off showing the pipeline in action on your tenant. Rest assured, I will meet all acceptance criteria outlined in the project description.

If you believe I am the right fit for your project, I encourage you to reach out to discuss further implementation details and timelines. Let's work together to bring your vision to life.
£4,000 GBP in 45 days
6.8

Hello, We've reviewed your project details regarding the Azure Blob ETL to SharePoint task and are eager to bring our expertise to your project. We understand you require a seamless ETL pipeline that ensures content is efficiently moved from Azure Blob Storage to SharePoint Online and another cloud repository, with robust error handling and logging. We've successfully executed similar ETL solutions using Azure-native tools like Data Factory and Logic Apps, integrating metadata enrichment and compression. Our approach has always been to create secure, scalable pipelines that align with your current infrastructure and enable easy monitoring and troubleshooting. Our team has over eight years of experience as top-rated experts in intelligent systems and automation. Our proficiency in Azure, SharePoint, and cloud integrations ensures we can deliver a solution that not only meets but exceeds your expectations. Our previous projects have demonstrated our ability to build solutions with unmatched quality, transparency, and reliability. We invite you to message us with more details so we can craft a detailed proposal within 24 hours to tailor-fit your requirements. We’re excited to collaborate and build something impactful together. Best regards, Puru Gupta
£5,000 GBP in 30 days
6.7

Hello, You’re looking to move files from Azure Blob Storage into SharePoint Online and another cloud repository using a lightweight but reliable ETL pipeline. I have experience building Azure-native data pipelines using Azure Functions, Logic Apps, and Azure Data Factory for file ingestion, transformation, and delivery to systems like SharePoint and cloud storage services. My focus is on reliable automation with structured logging, retry handling, and infrastructure-as-code so deployments remain consistent and maintainable.

Implementation approach
• Detect new or updated blobs using event triggers or last-modified tracking
• Process files through a lightweight transformation layer (metadata enrichment, naming conventions, optional compression)
• Deliver files to SharePoint document libraries and the selected cloud repository
• Implement retry logic with up to three attempts and clear logging of failures
• Deploy using ARM/Bicep templates with source-controlled code
• Provide monitoring visibility so job status, successes, and failures are easy to review

Before starting, I’d just confirm two details:
1. Approximately how many blobs and what average file size should the pipeline handle daily?
2. Should blob detection be event-based (Blob trigger) or strictly incremental using last-modified timestamps?

Regards, Muhammad Shoaib
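The incremental option raised in the second question can be sketched as follows. This is a minimal, illustrative Python snippet that assumes the pipeline persists a "watermark" timestamp between runs and receives blob listings as (name, last_modified) pairs; the function name `detect_changed` is hypothetical.

```python
from datetime import datetime, timezone

def detect_changed(blobs, last_run):
    """Return names of blobs created or modified after the last successful run.

    `blobs` is an iterable of (name, last_modified) pairs, as produced by
    listing a container; `last_run` is the stored watermark timestamp.
    """
    return [name for name, modified in blobs if modified > last_run]
```

After a successful run the watermark would be advanced to the newest `last_modified` seen, so each blob is processed exactly once. The event-based alternative (Blob trigger / Event Grid) avoids polling entirely but needs dead-letter handling for missed events.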
£3,000 GBP in 7 days
6.7

Hello there, This aligns perfectly with my Azure integration experience. I would design an Azure-native ETL pipeline using Azure Data Factory for orchestration, combined with Azure Functions for metadata enrichment, file renaming, and optional compression. Blob change detection can be handled via Event Grid triggers or scheduled ADF pipelines. For SharePoint Online, I’d use the Microsoft Graph API with managed identity where possible; the third-party repository would integrate via a secure REST connector. The solution will include structured logging (Application Insights + Log Analytics), retry logic (up to three attempts), and clear failure reporting. All resources will be deployed via Bicep templates, fully source-controlled, with a concise README and a recorded end-to-end hand-off demonstration.

Links to my Azure projects:
https://www.freelancer.com/projects/cloud-networking/Azure-Migration-for-Secure-Web/reviews
https://www.freelancer.com/projects/azure/Azure-API-Development-for-Apps/reviews
https://www.freelancer.com/projects/microsoft-azure/Comprehensive-Kiosk-Revamp
https://www.freelancer.com/projects/azure/Azure-API-Builder-net-Developer/reviews
https://www.freelancer.com/projects/dot-net/Update-Map-Control-Azure-Maps/details

Thanks.
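To illustrate the Event Grid trigger path described above: a `Microsoft.Storage.BlobCreated` event carries the blob URL in its documented `data.url` field, from which the container and blob path can be recovered. This is a minimal sketch (the helper name `parse_blob_event` is hypothetical), not the full Function handler.

```python
from urllib.parse import urlparse

def parse_blob_event(event):
    """Extract (container, blob_path) from an Event Grid BlobCreated event.

    Uses the `data.url` field of the Microsoft.Storage.BlobCreated schema;
    other event types are ignored and return None.
    """
    if event.get("eventType") != "Microsoft.Storage.BlobCreated":
        return None
    # URL shape: https://<account>.blob.core.windows.net/<container>/<path>
    path = urlparse(event["data"]["url"]).path.lstrip("/")
    container, _, blob_path = path.partition("/")
    return container, blob_path
```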
£4,000 GBP in 30 days
6.1

Hi there, I will design a lightweight Azure-native ETL to detect new/updated Azure Blob content, enrich metadata, apply naming rules/compression and deliver files to SharePoint Online and a chosen third-party cloud with secure, retryable delivery.
- Build event-driven detection (Event Grid -> Azure Function / Logic App) and/or scheduled ADF pipeline for bulk runs
- Implement transformations: metadata enrichment, filename normalization, optional ZIP compression; map metadata to SharePoint columns
- Reliable delivery to SharePoint Online (CSOM/Graph) and third-party (S3-compatible or REST) with retry (3x) and dead-letter logging
- Provide ARM/Bicep templates, source-controlled C# functions, and logging/monitoring (Application Insights + Log Analytics)

Skills:
✅ Azure Blob
✅ Azure Data Factory / Logic Apps / Azure Functions
✅ metadata enrichment / naming conventions / compression
✅ deployment via ARM or Bicep / CI-CD
✅ logging, retry logic, error handling, Application Insights

Certificates:
✅ Microsoft® Certified: MCSA | MCSE | MCT
✅ cPanel® & WHM Certified CWSA-2

I can start immediately and deliver templates, code and a recorded hand-off. Which third-party cloud providers do you prefer (e.g., AWS S3, Box, Dropbox, Google Drive, custom REST) and how do you plan to supply authentication (managed identity, service principal, or API key)?

Price: $4,200 | Delivery: 7 days

Best regards,
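The filename-normalization and optional-ZIP steps in the bullets above could be sketched like this in Python (shown for illustration even though the bid proposes C# functions; the names `normalize_name` and `maybe_compress` and the 1 KiB threshold are assumptions):

```python
import io
import re
import zipfile

def normalize_name(name):
    """Apply a simple naming convention: lowercase, non-alphanumeric runs
    collapsed to single hyphens, extension preserved."""
    stem, dot, ext = name.rpartition(".")
    stem = stem or name  # handle names without an extension
    slug = re.sub(r"[^a-z0-9]+", "-", stem.lower()).strip("-")
    return f"{slug}.{ext.lower()}" if dot else slug

def maybe_compress(name, payload, threshold=1024):
    """ZIP the payload in memory only when it is large enough to be worth it."""
    if len(payload) < threshold:
        return name, payload
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr(name, payload)
    return name + ".zip", buf.getvalue()
```

For example, `normalize_name("Q1 Report (Final).DOCX")` yields `q1-report-final.docx`, which keeps SharePoint URLs clean and predictable.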
£4,200 GBP in 7 days
4.8

Hello, I’m excited about the opportunity to contribute to your project. With expertise in Azure Functions, Logic Apps, Azure Data Factory, Blob Storage event triggers, SharePoint Online (Microsoft Graph API), and secure Azure automation with ARM/Bicep and Key Vault, plus a strong focus on clean, scalable implementation, I can deliver a solution that aligns with your goals. I’ll tailor the work to your exact requirements, ensuring reliable blob change detection, metadata enrichment and naming normalization, optional compression, robust retry/error handling with clear logging, and repeatable infrastructure-as-code deployment inside your Azure subscription. You can expect clear communication, fast turnaround, and a high-quality result that fits seamlessly into your existing workflow. Best regards, Juan
£3,000 GBP in 7 days
4.9

Hello! I've been recommended by a Freelancer Recruiter. Nice to meet you. I've just completed a similar Azure-native data integration project that involved migrating data from Azure Blob Storage to a cloud repository with custom metadata enrichment. I'm a strong fit for this project because I have extensive experience with Azure-native solutions, including Azure Data Factory, Logic Apps, and Azure Functions, which will help ensure a seamless and scalable data transfer process.

We can leverage Azure Data Factory to create a robust ETL pipeline that automatically detects new or updated blobs, performs transformations, and delivers the output to SharePoint Online and the third-party cloud repository. I can also implement solid error handling and retry logic, as well as clear logging, to provide you with visibility into job status and performance. I've successfully implemented similar pipelines, reducing manual work by up to 80% and ensuring zero downtime for over a year. I have multiple 5-star reviews on Azure-native data integration projects, including Azure Data Factory and Azure Functions integrations.

Happy to hop on a quick call (no obligation) to discuss architecture, timeline, and a clear plan + quote. Chris | Lead Developer | Novatech
£5,000 GBP in 7 days
3.8

I’ve built Azure-native ETL pipelines moving document assets between Blob Storage, SharePoint Online, and external repositories with full retry logic, logging, and template-based deployment.

Recommended architecture:
- Azure Function (Python or C#) triggered by Blob events (Event Grid) to detect new or updated files.
- Transformation layer inside the Function for metadata enrichment, renaming, and optional compression.
- Delivery to SharePoint via Microsoft Graph API and to the third-party cloud via REST API connector.
- Retry and dead-letter handling using Azure Storage Queues.
- Centralized logging in Application Insights for job status, failures, and metrics.

Alternative for higher orchestration visibility: Azure Data Factory to manage pipeline flow, with Functions handling transformation logic.

Security:
- Managed identities for SharePoint access.
- Secrets stored in Azure Key Vault.
- All resources deployed via Bicep templates for full reproducibility.

Deliverables include source-controlled Function code, Bicep templates, configuration README, and recorded walkthrough.

Timeline: 4–5 weeks
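The retry and dead-letter step in the architecture above can be sketched as follows. In production the dead-letter list would be an Azure Storage Queue, as the bid proposes; this minimal Python version (the name `process_batch` is hypothetical) only shows the control flow: failed items stop blocking the batch after three attempts and are parked for later inspection.

```python
import logging

log = logging.getLogger("etl.dlq")

def process_batch(blobs, deliver, max_attempts=3):
    """Deliver each blob; after max_attempts failures, move it to a
    dead-letter list instead of blocking the rest of the batch."""
    dead_letter = []
    for blob in blobs:
        for attempt in range(1, max_attempts + 1):
            try:
                deliver(blob)
                break  # delivered; move on to the next blob
            except Exception as exc:
                log.warning("%s attempt %d failed: %s", blob, attempt, exc)
        else:
            # loop exhausted without a break: all attempts failed
            dead_letter.append(blob)
    return dead_letter
```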
£4,000 GBP in 20 days
3.2

Hello, I have 6+ years of hands-on experience building Azure-native data integration and automation pipelines in the software services industry, including Blob Storage migrations, SharePoint Online integrations, and secure ETL workflows with enterprise logging and retry orchestration. Your requirement aligns directly with solutions I’ve delivered for document lifecycle automation inside Azure tenants.

✅ Proposed Technical Approach
1️⃣ Event-Driven ETL Pipeline
- Azure Data Factory / Functions architecture
- Blob trigger detection (new & updated files)
- Incremental processing logic
2️⃣ Transformation Layer
- Metadata enrichment & naming normalization
- Optional compression handling
- File-type aware processing (DOCX, CSV, images)
3️⃣ Multi-Destination Delivery
- SharePoint Online API integration
- Third-party cloud connector abstraction
- Secure authentication via Managed Identity
4️⃣ Reliability & Observability
- Retry policy (3x automated recovery)
- Centralized logging (App Insights / Log Analytics)
- Status dashboards & failure alerts

Relevant Projects
- Enterprise Cloud File Migration ETL Framework
- Serverless Data Integration System (ADF + Functions)

I can show demo pipeline code, ARM/Bicep templates, and logging dashboards before we finalize the deal.
✔ Fully redeployable IaC templates
✔ Secure Azure-native architecture
✔ Clean documentation + handoff recording
✔ Production-ready automation

Ready to discuss implementation details and timeline — available to start immediately.
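The metadata enrichment and file-type aware processing steps above can be sketched like this. The column names (`SourceContainer`, `MigratedAt`) are illustrative placeholders for whatever the SharePoint library actually defines, and `enrich_metadata` is a hypothetical helper name.

```python
import mimetypes
from datetime import datetime, timezone

def enrich_metadata(blob_name, source_container):
    """Build the metadata dictionary attached to each file before delivery.

    Content type is inferred from the file extension, falling back to a
    generic binary type for unknown extensions.
    """
    content_type, _ = mimetypes.guess_type(blob_name)
    return {
        "Name": blob_name,
        "ContentType": content_type or "application/octet-stream",
        "SourceContainer": source_container,
        "MigratedAt": datetime.now(timezone.utc).isoformat(),
    }
```

Each key would then be mapped onto a SharePoint column so the acceptance criterion of "matching metadata" in the target library can be verified blob by blob.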
£5,000 GBP in 30 days
2.6

I understand you require a reliable Azure-native ETL pipeline to move mixed-format files from Azure Blob Storage into SharePoint Online and a third-party cloud repository, with automated detection of new or updated blobs, metadata enrichment, and robust error handling. Your need for source-controlled deployment templates and clear logging to monitor job status is well noted. With over 15 years and 200+ projects in cloud and data integration, I specialize in Azure solutions and automation, including Azure Data Factory, Logic Apps, and Functions, ensuring secure, maintainable pipelines within client subscriptions. I will design a lightweight ETL workflow using Azure Data Factory combined with Azure Functions for transformation and metadata enrichment, implementing retry logic and detailed logging accessible via Azure Monitor. The pipeline will be fully deployable through ARM or Bicep templates, with a concise read-me and a recorded hand-off session demonstrating end-to-end execution, aiming for delivery within a 3-4 week timeframe. Let’s discuss the specifics and timeline to align this solution perfectly with your environment.
£3,300 GBP in 7 days
0.7

Hi, I can help you with this. I am a Business Automation expert with 6+ years of experience helping teams reduce manual work and streamline operations. I've helped previous clients with similar projects; you can see my portfolio on my profile. Let me know if you're interested. Sincerely, Nicolas
£4,000 GBP in 7 days
2.3

I am excited about the opportunity to help you create a robust ETL solution for migrating content from Azure Blob Storage to SharePoint Online and a third-party cloud repository. With extensive experience in Azure Data Factory, Logic Apps, and Functions, I can ensure a secure and efficient pipeline that meets your requirements, including automated blob detection, metadata enrichment, and comprehensive logging. My commitment to quality, coupled with a proven track record of successful Azure implementations, will provide you with a reliable and scalable solution. Let's work together to bring your vision to life.
£4,000 GBP in 7 days
0.0

West End, Woking, United Kingdom
Payment method verified
Member since Nov 14, 2025