TechBrothersIT
  • 1,913
  • 29,168,286
How to read Multiple CSV files from GCS Bucket in BigQuery by using External Table
In this video, we will learn how to read multiple CSV files from a Google Cloud Storage (GCS) bucket in BigQuery by using an external table. BigQuery is a powerful tool that allows you to access, analyze, visualize, and share billions of rows of data from your spreadsheet with Connected Sheets, the new BigQuery data connector.
To get started, you need to meet all of the following requirements: access to the Google Cloud Platform, BigQuery access, and a project with billing set up in BigQuery. Once you have met these requirements, you can create an external table in BigQuery that references the CSV files in your GCS bucket. You can then query the external table as if it were a native BigQuery table.
To create an external table, you need to specify the schema of the CSV files and the location of the GCS bucket. You can also specify other options such as the delimiter, the number of header rows, and the file format. Once you have created the external table, you can query it using standard SQL syntax.
Using external tables has several advantages over loading data into BigQuery. First, external tables allow you to query data without incurring the cost of loading it into BigQuery. Second, external tables allow you to query data in real time, so you can always access the latest data in your GCS bucket. Finally, external tables allow you to query data across multiple GCS buckets and projects.
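As a rough illustration of the steps described above, here is a minimal Python sketch using the google-cloud-bigquery client library; the project ID, dataset, bucket path, column names, and table name are hypothetical placeholders, and the video itself may perform the same steps through the BigQuery console or SQL DDL rather than Python.

from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project ID

# Schema of the CSV files; it must match the column layout in every file.
schema = [
    bigquery.SchemaField("id", "INT64"),
    bigquery.SchemaField("region", "STRING"),
    bigquery.SchemaField("sale_amount", "NUMERIC"),
]

# External configuration: a wildcard URI picks up every matching CSV file in the bucket.
external_config = bigquery.ExternalConfig("CSV")
external_config.source_uris = ["gs://my-bucket/sales/*.csv"]  # hypothetical bucket/prefix
external_config.options.skip_leading_rows = 1  # one header row per file
external_config.options.field_delimiter = ","

# Create the external table; the data stays in GCS and is read at query time.
table = bigquery.Table("my-project.my_dataset.ext_sales", schema=schema)
table.external_data_configuration = external_config
client.create_table(table, exists_ok=True)

# Query it with standard SQL, just like a native table.
query = "SELECT region, COUNT(*) AS row_count FROM `my-project.my_dataset.ext_sales` GROUP BY region"
for row in client.query(query).result():
    print(row.region, row.row_count)

Because the table is external, every query reads the current contents of the bucket, which is what gives the "always the latest data" behaviour described above.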
#bigquery #gcp #googlecloudplatform
Views: 764

Videos

How to read Multiple Google Sheets from Google Spreadsheet in BigQuery | BigQuery External Table
412 views • 7 months ago
In this video, we will learn how to read multiple Google Sheets from a Google Spreadsheet in BigQuery. BigQuery is a powerful tool that allows you to access, analyze, visualize, and share billions of rows of data from your spreadsheet with Connected Sheets, the new BigQuery data connector. To get started, you need to meet all of the following requirements: Access to the Google Cloud Platform, B...
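For reference, a comparable Python sketch (again using the google-cloud-bigquery client, with a hypothetical project, dataset, spreadsheet ID, and tab names) that creates one external table per sheet tab; BigQuery needs the Drive scope to read a Google Sheet, and the video may configure this through the console instead.

import google.auth
from google.cloud import bigquery

# BigQuery needs the Drive scope in addition to its own scope to read Google Sheets.
credentials, project_id = google.auth.default(
    scopes=[
        "https://www.googleapis.com/auth/bigquery",
        "https://www.googleapis.com/auth/drive",
    ]
)
client = bigquery.Client(credentials=credentials, project=project_id)

SPREADSHEET_URL = "https://docs.google.com/spreadsheets/d/SPREADSHEET_ID"  # hypothetical

# One external table per tab; each table reads its own sheet range at query time.
for tab in ["Sales2022", "Sales2023"]:  # hypothetical tab names
    config = bigquery.ExternalConfig("GOOGLE_SHEETS")
    config.source_uris = [SPREADSHEET_URL]
    config.options.range = f"{tab}!A1:F"  # sheet (tab) name plus the column range to read
    config.options.skip_leading_rows = 1  # header row in each tab
    config.autodetect = True  # infer the schema from the sheet contents

    table = bigquery.Table(f"{project_id}.my_dataset.ext_{tab.lower()}")
    table.external_data_configuration = config
    client.create_table(table, exists_ok=True)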
How to read data from Google Sheet Using Big Query | Big Query External Table
572 views • 7 months ago
In this video, we will learn how to read data from Google Sheets using Big Query. Big Query is a powerful tool that allows you to access, analyze, visualize, and share billions of rows of data from your spreadsheet with Connected Sheets, the new BigQuery data connector. To get started, you need to meet all of the following requirements: Access to the Google Cloud Platform, BigQuery access, and ...
How to upgrade GCP Postgresql from old version to new version by using Database Migration DMS in GCP
328 views • 8 months ago
In this tutorial, we will learn how to upgrade Google Cloud Platform (GCP) PostgreSQL from an old version to a new version by using Database Migration Service (DMS) in GCP. We will start by discussing the importance of upgrading your database and the benefits of using DMS. Then, we will walk through the steps required to upgrade your PostgreSQL instance in-place using DMS. We will cover how to ...
How to Get the List of all Files with Size,Modified and Path from GCS Bucket and Load into BigQuery
439 views • 8 months ago
In this video, you will learn how to get the list of all files with their size, modified date, and path from a Google Cloud Storage (GCS) bucket and load them into BigQuery. We will guide you through the process step by step, starting with setting up the necessary permissions and credentials for accessing GCS and BigQuery. By the end of this video, you will have a clear understanding of how to ...
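A minimal sketch of the same idea in Python, assuming the google-cloud-storage and google-cloud-bigquery client libraries and hypothetical project, bucket, dataset, and table names; the video may set up the permissions and the load step differently.

from google.cloud import bigquery, storage

storage_client = storage.Client(project="my-project")  # hypothetical project ID
bq_client = bigquery.Client(project="my-project")

# Collect path, size, and last-modified timestamp for every object in the bucket.
inventory = []
for blob in storage_client.list_blobs("my-bucket"):  # hypothetical bucket name
    inventory.append(
        {
            "file_path": f"gs://my-bucket/{blob.name}",
            "size_bytes": blob.size,
            "last_modified": blob.updated.isoformat(),
        }
    )

# Load the file inventory into a BigQuery table, replacing any previous run.
job_config = bigquery.LoadJobConfig(
    schema=[
        bigquery.SchemaField("file_path", "STRING"),
        bigquery.SchemaField("size_bytes", "INT64"),
        bigquery.SchemaField("last_modified", "TIMESTAMP"),
    ],
    write_disposition="WRITE_TRUNCATE",
)
bq_client.load_table_from_json(
    inventory, "my-project.my_dataset.gcs_file_inventory", job_config=job_config
).result()  # wait for the load job to finish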
How to use Bing AI to write SQL Server Queries as SQL Server Developer or SQL Server DBA
1.4K views • 8 months ago
In this video, you will learn how to use Bing AI to write SQL Server queries as a SQL Server Developer or SQL Server DBA. We will introduce you to some of the most popular AI-powered SQL query builders, including AI2sql and AirOps. You will learn how to use these tools to write efficient, error-free SQL queries without knowing SQL. You will learn how to optimize your queries by selecting the ri...
How to Read Data from GCS Google Cloud Storage Bucket to Azure Blob Storage | Azure Data Factory
700 views • 8 months ago
In this video, we will explore how to read data from a Google Cloud Storage (GCS) bucket and transfer it to Azure Blob Storage using Azure Data Factory. In this video tutorial, we'll walk you through the step-by-step process of setting up the necessary authorization credentials and copy data from GCS to Azure Blob Storage. We'll also cover any differences in bucket and object naming rules betwe...
How to Read the Data from BigQuery Table and Write to CSV File in Blob Storage | Azure Data Factory
1.4K views • 9 months ago
In this tutorial, you will learn how to extract data from a BigQuery table and save it as a CSV file in Azure Blob Storage using Azure Data Factory. The video will guide you through the process step-by-step, covering the necessary configurations and settings. The tutorial will cover the following topics: Setting up Azure Data Factory and connecting it to your BigQuery project. Creating a pipeli...
How to Activate and Deactivate Activities in Azure Data Factory Step-by-Step Tutorial-ADF Tutorial
875 views • 9 months ago
In this video we are going to learn how to activate and deactivate activities in Azure Data Factory, step by step. Azure Data Factory Tutorial 2023 - ADF Tutorial for Beginners ...
How to find out Who has created the GCP project in Google Cloud Platform
2.4K views • 1 year ago
In this video, I will show you how to find out who has created a GCP project in Google Cloud Platform. A GCP project is a logical collection of your resources in GCP, like a subscription in Azure (learn.microsoft.com/en-us/azure/active-directory/cloud-infrastructure-entitlement-management/onboard-gcp). You can use the Google Cloud console, the gcloud command-line tool, or the Cloud Resource Manage...
Unroll Multiple Arrays from JSON File in a Single Flatten Step in Azure Data Factory | ADF Tutorial
4.2K views • 1 year ago
In this video we are going to learn how to unroll multiple arrays from a JSON file in a single Flatten step in Azure Data Factory, step by step (ADF Tutorial 2023). Video Link: ua-cam.com/video/zosj9UTx7...
SQL Queries and Google BARD AI -Testing Bard for SQL Queries, ADF and Python Artificial Intelligence
1.4K views • 1 year ago
In this video, I will be experimenting with Bard, a large language model from Google AI. Bard is trained on a massive dataset of text and code, and can generate text, translate languages, write different kinds of creative content, and answer your questions in an informative way. In this video, I will be using Bard to experiment with SQL queries, Python, and Azure Data Factory. I will be using B...
How to perform CDC from PostgreSQL to Big Query by using DataStream in Google Cloud Platform
3.3K views • 1 year ago
In this video we are going to learn how to perform CDC from PostgreSQL to BigQuery by using Datastream in Google Cloud Platform, step by step (Google Cloud Platform Tutorial 2023) ...
Set Pipeline Return Value in Azure Data Factory-How to Pass Values between Two ADF Pipelines
4.1K views • 1 year ago
In this video we are going to learn how to set a pipeline return value in Azure Data Factory and how to pass values between two ADF pipelines (Azure Data Factory Tutorial | ADF New Features 2023) ...
How to Use White or Dark Theme in Azure Data Factory Studio | Azure Data Factory Tutorial 2023
349 views • 1 year ago
In this video we are going to learn how to use the white or dark theme in Azure Data Factory Studio (Azure Data Factory Tutorial | ADF New Features 2023) ...
How to use List or Container Monitoring View in Azure Data Factory Studio ADF Tutorial 2023
499 views • 1 year ago
How to upgrade GCP MySQL Instance by using Database Migration Service-Reduce GCP MySQL Instance Disk
921 views • 1 year ago
How to Connect MySQL Workbench or Heidi SQL to Google Cloud SQL Locally Using Cloud SQL Proxy
2.3K views • 1 year ago
How to Connect to BigQuery from Tableau by using GCP Service Account GCP Tutorial 2022
3.5K views • 1 year ago
How to perform CDC from GCP MySQL Instance to BigQuery by using DataStream in Google Cloud Platform
3.3K views • 1 year ago
How to Perform Cross Database Queries in PostgreSQL in GCP Cloud | GCP Cloud SQL Tutorial 2022
2.3K views • 1 year ago
Cross database query between Google SQL instances PostgreSQL | GCP SQL Tutorial 2022
452 views • 1 year ago
How to Stop GCP PostgreSQL from Logging Passwords in Clear Text in Logs GCP Cloud SQL Tutorial
340 views • 1 year ago
How to Perform In place Upgrade to GCP SQL Instance | Inplace Upgrade PostgreSQL 12 to PostgreSQL 14
955 views • 1 year ago
How to use Cast Transformation in Data Flow Task | Azure Data Factory Tutorial 2022
1.2K views • 1 year ago
How to Drop User or Role in PostgreSQL Instance on Google Cloud Platform | GCP Tutorials 2022
644 views • 1 year ago
How to Check if Value Exists in Input Column | Contains Function & InStr Function in Data Flow | ADF
2.4K views • 1 year ago
How to Stop and Start SQL Instances on Schedule by using Cloud Schedule in GCP | GCP SQL Tutorial
1.5K views • 1 year ago
How to Schedule Maintenance with PostgreSQL pg cron Extension | Google Cloud Platform SQL Tutorial
1.7K views • 1 year ago
How to Connect to GCP SQL Server Instance by using Private IP Configuration | GCP SQL Tutorial 2022
1.8K views • 1 year ago

COMMENTS

  • @Khader_views
    @Khader_views 11 hours ago

    Apart from the Web activity, do we have any other activity option to bring in your Key Vault?

  • @ayushikankane530
    @ayushikankane530 1 day ago

    If the CSV file is having some columns as a JSON structure, then how to proceed?

  • @sri929
    @sri929 1 day ago

    Hi TechBrothersIT, thanks for the video. I have a new scenario: how to load CSV files with more columns than the destination table? For example, if my CSV file has a few additional columns which are not available in the destination table, then it should skip those columns and insert data only into the available columns in the destination table. Could you please help with this?

  • @ericsos101
    @ericsos101 1 day ago

    What about if the columns are delimited by “/”? How to come up with the data row lengths?

  • @subhashreechhualsingh8174
    @subhashreechhualsingh8174 2 days ago

    Hello, I have one question. Suppose I have a complex SQL query. Now I want to create one pipeline where I can store the output of this SQL query as parameters, so that other pipelines that call this pipeline can use these values. How to achieve that in ADF?

  • @stevesmith1678
    @stevesmith1678 2 days ago

    Is it compulsory to change synchronous commit to async? Instead of that, can we suspend data movement on the secondary and apply the patch on the secondary?

  • @komalmishra5236
    @komalmishra5236 3 days ago

    Sir, you are one of the best tutors that I found on YouTube. I always used to watch your videos; they really helped me throughout my career growth. I have been following you for 5 years.

    • @TechBrothersIT
      @TechBrothersIT 2 days ago

      Glad to hear that you liked my effort and that it is helpful. I appreciate that you left such nice comments.

  • @williamtenhoven8405
    @williamtenhoven8405 5 days ago

    What would be the 'truncate script' in the 'Pre-copy script' section? I can't get this right. I have @concat('delete from @dataset().dsTableName') but this does not work.

  • @abamyehdgo
    @abamyehdgo 5 days ago

    Nice tutorial as always. So when you run the pipeline again after adding North America, why do you not get duplicate values for Asia and Europe?

  • @NicolasMuriel-xo3wf
    @NicolasMuriel-xo3wf 5 days ago

    Hello, I think it is not possible to set it to write mode 'append only'?

  • @sjt20690
    @sjt20690 6 days ago

    I see that you have not locked the Storage account behind the private VNet. Is the Storage account being used over a public IP in this workflow?

  • @tommytammy6137
    @tommytammy6137 6 days ago

    Straight to the point. Thank you!

  • @user-iw2rl9qy1f
    @user-iw2rl9qy1f 6 days ago

    Even in 2024 these videos are the best to watch if you want to be an expert in SSIS package deployments. Thanks, sir, for all this great knowledge sharing.

  • @AngelVivekKerkettaA
    @AngelVivekKerkettaA 7 days ago

    Thanks for guidelines

  • @GauravKiRachna
    @GauravKiRachna 7 days ago

    Can we have this change dynamically? Like the user has access to view 20, 50, or say 100 rows in that report. Is this possible?

  • @AndreasLenski-ni9mh
    @AndreasLenski-ni9mh 7 days ago

    Does it work with Shared Folders as well? I get an error: "Exception: Cannot retrieve the next object: iterator has reached the end."

  • @ZoneZae
    @ZoneZae 10 days ago

    how often do you recommend backing up tlog?

  • @user-ed6ic4th8o
    @user-ed6ic4th8o 12 days ago

    You got an accent which sounds like a mix between Indian and American, where are you from?

    • @TechBrothersIT
      @TechBrothersIT 11 days ago

      Yes, you are right. I am from Pakistan but have been living in the USA for many years.

  • @GauravSharma-os6ds
    @GauravSharma-os6ds 12 days ago

    Very informative video. That's a real-world scenario you covered.

  • @user-nv4ek6te4b
    @user-nv4ek6te4b 12 days ago

    Hi, my SSIS package is successfully loading data, but in the Windows Event Viewer log I am getting a message saying 'package.dtsx' failed. What might be the reason? Please help.

  • @frederikceglarek
    @frederikceglarek 12 days ago

    Wonderfully explained!

  • @kunaldhiman8733
    @kunaldhiman8733 13 days ago

    Can we do it without an input file?

  • @marioanzaldua8020
    @marioanzaldua8020 13 days ago

    The package runs successfully but no data is transferred. Execution results says: [Flexible File Source] Error: Cannot find the file specified. But the connection manager is good and it finds all the columns.

  • @fidelixwashington
    @fidelixwashington 13 days ago

    I did exactly the same as shown in the video and it worked with 17 files. Great content!

  • @zodiakzodiak1021
    @zodiakzodiak1021 14 days ago

    Thank you, this video helped me a lot. I wish you all the best in life and more subscribers :)

  • @aniketsakpal8174
    @aniketsakpal8174 14 days ago

    Thank you so much 🎉

  • @randomguy9241
    @randomguy9241 15 days ago

    Thank you!

  • @etsegenetadugna8936
    @etsegenetadugna8936 15 days ago

    Do we need to restart SQL Server for the tempdb space release to take effect?

  • @shanmugavadivelArts
    @shanmugavadivelArts 15 days ago

    nice

  • @gauravpratap4482
    @gauravpratap4482 17 days ago

    But here PARTITION BY is not applied.

  • @litreyal4348
    @litreyal4348 18 days ago

    thank u bro

  • @connectwithkarthicksurya3678
    @connectwithkarthicksurya3678 18 days ago

    Very good work

  • @lax976
    @lax976 18 days ago

    Why fake accent

    • @TechBrothersIT
      @TechBrothersIT 17 days ago

      It is mixed; I am not trying to make it like that.

  • @megharaina6108
    @megharaina6108 20 days ago

    What happens if the SQL Server is not in the same region, the same resource group, or the same subscription?

  • @alifakhraee
    @alifakhraee 20 days ago

    Very helpful! Thanks a lot!

  • @MaDmAxLoCaL
    @MaDmAxLoCaL 20 days ago

    I want to store data in a variable for a larger file, but the Web activity has only a 4 MB limitation. Please help me here.

  • @MaDmAxLoCaL
    @MaDmAxLoCaL 20 days ago

    I was getting this error: The length of execution output is over limit (around 4MB currently)

  • @java_interview
    @java_interview 23 days ago

    Thank you so much 😀

  • @user-rz8yv4vk1r
    @user-rz8yv4vk1r 24 days ago

    Thanks bro this is useful.

  • @prakashnayak61
    @prakashnayak61 25 days ago

    Thanks a lot.... Badly needed this.👍

  • @Mendela2019
    @Mendela2019 25 days ago

    MariaDB database backup and restore tutorial??

  • @SynonAnon-vi1ql
    @SynonAnon-vi1ql 26 days ago

    Hi, a question on the branching strategy discussed at 16:49. I have seen several videos on YouTube, but I don't see a collaboration branch created off of the master branch with, say, Amir's branch then created from it. Can you tell why that is not the case, and why Amir's branch is created directly off of the master branch? There might be a possibility of two data engineers working on the same factory, so it might make sense to create their own branches off of a collaboration (feature) branch. No? Thank you!

  • @jeffersonsantiago7902
    @jeffersonsantiago7902 1 month ago

    I am planning to migrate my databases from my old version of MySQL (5.0 series) to the 8.0 series. Is it okay to use the backup created by mysqldump, or is it more advisable to use the "Migration Wizard"?

  • @jeffersonsantiago7902
    @jeffersonsantiago7902 1 month ago

    Hi! Is it compatible to migrate my database from an old version such as 5.6.23 / 5.7.17 / 5.7.39 to 8.0.37 using the "Schema Transfer Wizard"? If yes, the collation of those versions is utf8_general_ci; does this mean it will also change to utf8mb3_general_ci?

  • @TheRajasekar03
    @TheRajasekar03 1 month ago

    Really great

  • @k.a.v.y.a.4-1-9-8
    @k.a.v.y.a.4-1-9-8 1 month ago

    Worst video. Please don't watch & waste your time

  • @SujeethMaroli--BOSCH
    @SujeethMaroli--BOSCH 1 month ago

    Not justified 🙄

  • @kemidiramesh200
    @kemidiramesh200 1 month ago

    Hi, I have written the below expressions: =Round(Sum(Fields!Average_Days_from_Flooring_to_Title_in_Vault.Value, "YTD") / CountRows("YTD"), 0) and =Round(Sum(Fields!Average_Days_from_Title_at_Auction_to_Vault, "YTD") / Sum(IIf(Fields!Average_Days_from_Title_at_Auction_to_Vault.Value > 0, 1, 0), "YTD"), 0). The report executed successfully but I am not getting the values. Please correct where I did wrong!

  • @mulshiwaters5312
    @mulshiwaters5312 1 month ago

    Can we create a schedule to copy files periodically?