It's quite simple! No Dockerfile, KubernetesPodOperator, LD_LIBRARY_PATH, etc. needed; a basic PythonOperator will do
Points to consider
- The GCP Composer worker's pod image is Ubuntu 16.04 (just run a basic PythonOperator with the command os.system('cat /etc/os-release') to check; see the sketch after this list)
- unixodbc-dev is already installed on the worker's pod image
- Composer creates a GCS bucket and mounts it on the Airflow workers
- So why not just install pyodbc from the PyPI packages and pass the MSSQL ODBC driver as a parameter to the pyodbc connect method
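For reference, a minimal sketch of the OS check from the first bullet (it assumes a `dag` object like the one defined in Step 3):

import os
from airflow.operators import python_operator

def print_os(**kwargs):
    # prints the worker pod's release info (Ubuntu 16.04 on Composer)
    os.system('cat /etc/os-release')

Tprint_os = python_operator.PythonOperator(
    task_id='Tprint_os',
    python_callable=print_os,
    dag=dag)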
Here 'gs://bucket_created_by_composer' == '/home/airflow/gcs'
gcs bucket created by composer ->
-> data/
-> dags/
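Because the bucket is mounted on the workers, the mapping can be verified from inside any task; a minimal sketch:

import os
# from inside a running task, the bucket's folders are visible under the mount point
print(os.listdir('/home/airflow/gcs'))       # expect entries like 'data' and 'dags'
print(os.listdir('/home/airflow/gcs/data'))  # the driver folder will land here in Step 2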
Step-by-step approach
Step 1: Install pyodbc and the MSSQL ODBC driver on any Ubuntu instance to get the driver files
To be concrete, let's do this on a GCP VM instance with an Ubuntu 18.04 image
#update the packages
sudo apt update
sudo apt-get update -y
curl https://packages.microsoft.com/keys/microsoft.asc | sudo apt-key add -
curl https://packages.microsoft.com/config/ubuntu/18.04/prod.list | sudo tee /etc/apt/sources.list.d/msprod.list
sudo apt-get update -y
echo Installing mssql-tools and unixODBC developer...
sudo ACCEPT_EULA=Y apt-get install -y mssql-tools unixodbc-dev
sudo apt-get update -y
sudo apt-get install -y mssql-tools #it includes sqlcmd and bcp (we don't need those)
sudo apt install python3-pip #installing pip3
pip3 install pyodbc
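Before grabbing the driver files, a quick sanity check on the VM may help (a minimal sketch, run with python3):

# confirm pyodbc imports, and find where the msodbcsql17 package registered
# its driver .so file (the Driver= line in /etc/odbcinst.ini)
import pyodbc
print(pyodbc.version)
with open('/etc/odbcinst.ini') as f:
    print(f.read())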
Step 2: Get the driver files and upload them to the data folder of the gcs_bucket created by Composer
cd /opt/microsoft
#now you can see there is one directory 'msodbcsql17', version may change
#we need to upload this directory to the data folder of gcs_bucket
#for this you may choose whichever approach suits you
#copying the directory to /home/<user> for proper zipping/uploading to gcs
cp -r msodbcsql17 /home/<user> #you may need to use sudo
#upload this /home/<user>/msodbcsql17 to any gcs_bucket
gsutil cp -r /home/<user>/msodbcsql17 gs://<your-gcs-bucket>
Download this folder from the GCS bucket to your local machine, then upload it to the data folder of the GCS bucket created by Composer
Choose any route/method; the main aim is to get the msodbcsql17 folder into the data folder of the GCS bucket created by Composer (see the sketch below for a direct bucket-to-bucket copy)
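If you'd rather skip the local download leg, here is a minimal sketch using the google-cloud-storage client; both bucket names are placeholders, not values from the steps above:

from google.cloud import storage

client = storage.Client()
src = client.bucket('your-gcs-bucket')             # staging bucket from the gsutil cp above
dst = client.bucket('bucket-created-by-composer')  # placeholder name

# copy msodbcsql17/ straight into the data/ folder of the Composer bucket
for blob in client.list_blobs(src, prefix='msodbcsql17/'):
    src.copy_blob(blob, dst, new_name='data/' + blob.name)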
Final structure:
gcs bucket created by composer ->
-> data/msodbcsql17/
-> dags/<your_dags.py>
Step 3: Make the pyodbc connection using this msodbcsql17 driver
Sample DAG:
import airflow
from airflow import DAG
from airflow.operators import python_operator

default_dag_args = {
    'start_date': airflow.utils.dates.days_ago(0),
    'provide_context': True
}

dag = DAG(
    'pyodbc_test',
    schedule_interval=None,  # change for composer
    default_args=default_dag_args
)

def check_connection(**kwargs):
    print('hello')
    driver = '/home/airflow/gcs/data/msodbcsql17/lib64/libmsodbcsql-17.5.so.2.1'
    # this is the main driver file; the exact location can be found in the
    # gcs_bucket/data folder, or check the /etc/odbcinst.ini file of the
    # ubuntu instance on which you installed pyodbc earlier

    def tconnection(ServerIp, LoginName, Password, mssql_portno):
        """A method which returns a connection object"""
        import pyodbc
        pyodbc.pooling = False
        try:
            sql_conn = pyodbc.connect(
                "DRIVER={4};SERVER={0},{1};UID={2};PWD={3}".format(
                    ServerIp, mssql_portno, LoginName, Password, driver))
        except pyodbc.Error as ex:
            sqlstate = ex.args[0]  # SQLSTATE code of the failure
            raise
        return sql_conn

    con = tconnection('<your-server-ip>', '<your-login-name>', '<your-password>', '1433')
    # recommendation is to take the password and login from airflow connections
    import pandas as pd
    q = 'select * from <your-db-name>.<your-schema-name>.<your-table-name>'
    df = pd.read_sql(q, con)
    print(df)

Tcheck_connection = python_operator.PythonOperator(
    task_id='Tcheck_connection',
    python_callable=check_connection,
    dag=dag)

# calling the task sequence
Tcheck_connection
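As the comment above recommends, credentials are better pulled from an Airflow connection than hard-coded; a minimal sketch, assuming a hypothetical connection id 'mssql_default':

from airflow.hooks.base_hook import BaseHook

def get_mssql_credentials():
    # 'mssql_default' is a hypothetical connection id; create it in the
    # Airflow UI with the server's host, login, password and port
    conn = BaseHook.get_connection('mssql_default')
    return conn.host, conn.login, conn.password, str(conn.port or 1433)

# inside check_connection, replace the literal values with:
#   server, login, password, port = get_mssql_credentials()
#   con = tconnection(server, login, password, port)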
PyPI packages (add these through the Composer environment's PyPI packages configuration):
pyodbc
pandas
Tested recently on Composer