`which python` and `which virtualenv` both look fine.
But when I execute this:
airflow test export_daily push_fb_style_plan_optout 2018-05-24
I get this result:
File "/usr/local/bin/airflow", line 17, in <module>
    from airflow import configuration
ImportError: cannot import name configuration
Do you have any ideas?
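For what it's worth, a minimal diagnostic sketch (plain stdlib, no Airflow imports): compare the interpreter you are running with where the `airflow` console script resolves on PATH. If they come from different environments (e.g. system Python vs. a virtualenv), an import like `from airflow import configuration` can pick up the wrong package and fail this way.

```python
# Sketch: check whether the current interpreter and the `airflow`
# console script come from the same environment.
import shutil
import sys

def locate(script="airflow"):
    """Return (current interpreter, path of `script` on PATH or None)."""
    return sys.executable, shutil.which(script)

interpreter, airflow_bin = locate()
print("interpreter:", interpreter)
print("airflow:   ", airflow_bin)  # None means the script is not on PATH
```

If the two paths sit under different prefixes (say `/usr/bin` vs. `~/.virtualenvs/foo/bin`), that mismatch is the first thing to fix.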
I'm trying to create a single Apache Airflow instance that can host multiple users who cannot affect one another negatively. This would include, but not be limited to, letting each user:
- Maintain their own Python environments
- Manage their own variables
- Manage their own connections
- Manage their own DAGs
Looking through the official Airflow docs, I see a couple of things that may help: 1) the ability to create users, and 2) the ability to be multi-tenant.
1) If I follow the user creation process at https://airflow.apache.org/security.html#web-authentication, all the users created seem to be admins. How do I create a non-admin user and control what they can and cannot do? I can't seem to locate any more documentation.
2) The link https://airflow.apache.org/security.html#multi-tenancy says that "You can filter the list of dags in webserver by owner name when authentication is turned on by setting", but I don't see how I can assign DAGs to specific users.
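On point 2, a sketch of the settings that appear to be involved, assuming Airflow 1.x (the key names are my assumption and may differ by version): the owner used for filtering comes from each DAG's `default_args`, e.g. `default_args={"owner": "alice"}`.

```ini
# airflow.cfg -- sketch, Airflow 1.x key names assumed
[webserver]
# require login before showing the UI
authenticate = True
# show each logged-in user only the DAGs whose owner matches their username
filter_by_owner = True
```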
Thanks for the help.
I am new to Apache Airflow.
I have been able to configure the airflow.cfg file to run tasks one after the other.
What I want to do is execute tasks in parallel, e.g. 2 at a time, until I reach the end of the list.
Please let me know how I can configure this.
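A sketch of the knobs usually involved, assuming Airflow 1.x (the connection string below is a placeholder, not a working value): the default SequentialExecutor runs one task at a time, so parallel execution needs the LocalExecutor (or Celery), which in turn needs a metadata database other than SQLite.

```ini
# airflow.cfg -- sketch; adjust the placeholder connection string
[core]
executor = LocalExecutor
# at most 2 task instances running across the whole instance
parallelism = 2
# at most 2 task instances running per DAG
dag_concurrency = 2
# LocalExecutor needs a real database, e.g. Postgres (placeholder URI):
sql_alchemy_conn = postgresql+psycopg2://user:pass@localhost:5432/airflow
```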
Hello everyone, I'm playing with Airflow and reading this helpful tutorial. I'm asking for help to better understand how Admin->Connections works, specifically the Conn Type "File (path)".
I suppose this connection type is meant to make a local filesystem folder accessible to my operator?
I followed the Airflow documentation and ran:
pip install airflow
After installing all the packages, running airflow gives:
airflow: command not found
Where did I go wrong? Any help would be appreciated.
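A common cause (a sketch, assuming pip placed the console script in a per-user directory such as `~/.local/bin` that is not on PATH):

```shell
# If pip installed with --user, the airflow entrypoint often lands in
# ~/.local/bin; add it to PATH and check again.
export PATH="$HOME/.local/bin:$PATH"
command -v airflow || echo "airflow still not on PATH"
```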
I'm trying to figure out a way to test a DAG where I have a couple of tasks communicating using XCom.
Since the console command only allows me to run individual tasks from a DAG, is there a way to test the communication without having to run the whole DAG via the UI?
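One way to sidestep the UI entirely (a sketch, not Airflow's own testing API; `push_value`, `pull_value`, and `FakeTaskInstance` are hypothetical names): unit-test the callables your PythonOperators wrap against a fake task instance, so the XCom hand-off itself can be asserted.

```python
# Sketch: assert an XCom push/pull pair without a scheduler or UI,
# using a stub in place of Airflow's TaskInstance.
class FakeTaskInstance:
    """Minimal stand-in exposing only xcom_push / xcom_pull."""
    def __init__(self):
        self._store = {}

    def xcom_push(self, key, value):
        self._store[key] = value

    def xcom_pull(self, key=None, task_ids=None):
        return self._store.get(key)

# Hypothetical callables, standing in for what your operators wrap:
def push_value(ti):
    ti.xcom_push(key="result", value=42)

def pull_value(ti):
    return ti.xcom_pull(key="result")

ti = FakeTaskInstance()
push_value(ti)
print(pull_value(ti))  # 42
```

The real TaskInstance has richer `xcom_pull` semantics (task_ids, execution dates), but for checking that one task's push matches another task's pull, a stub like this is usually enough.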
Is it possible to customize the format that Airflow uses for logging?
I tried adding a LOG_FORMAT variable to $AIRFLOW_HOME/airflow.cfg, but it doesn't seem to take effect:
LOG_FORMAT = "%(asctime)s logLevel=%(levelname)s logger=%(name)s - %(message)s"
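For what it's worth, a sketch assuming Airflow 1.x (key name is my assumption for that version): settings in airflow.cfg are lowercase, so it would be `log_format` under `[core]`, and literal percent signs have to be doubled because the file is read with ConfigParser-style interpolation.

```ini
# airflow.cfg -- sketch, Airflow 1.x key name assumed
[core]
log_format = %%(asctime)s logLevel=%%(levelname)s logger=%%(name)s - %%(message)s
```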
I'm trying to create a DB2/DashDB connection using the Airflow UI. I have added the db2jcc4.jar driver and provided its path as well as the class name com.ibm.db2.jcc.DB2Driver.class.
I tried to run a simple query (in the Ad Hoc Query UI) and always get the same error:
java.lang.RuntimeException: Class com.ibm.db2.jcc.DB2Driver.class not found
Has anybody needed to set up a DB2/DashDB connection in Apache Airflow before?
I found nothing on the web about this.
When I put a new DAG Python script in the dags folder, I can see a new entry in the DAG UI, but it is not enabled automatically. On top of that, it does not seem to load properly either. To get the DAG to schedule, I have to click the Refresh button on the right side of the list a few times and toggle the on/off button on the left side. These are manual steps I have to trigger even though the DAG script is already inside the dags folder.
Can anyone help me with this? Did I miss something? Or is this the correct behavior in Airflow?
By the way, as mentioned in the post title, before I go through this manual process there is an indicator next to the DAG title with the message "This DAG isn't available in the webserver DagBag object. It shows up in this list because the scheduler marked it as active in the metadata database".
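As a quick sanity check (a sketch using only the stdlib; the "dags" path is a placeholder for your real dags folder): a DAG file with a syntax error can be marked active in the scheduler's metadata yet never make it into the webserver's DagBag, so it can help to verify that every file in the folder at least compiles.

```python
# Compile every .py file under a dags folder to surface syntax errors
# before the scheduler/webserver tries to load them.
import pathlib
import py_compile

def find_broken_dags(dags_folder):
    """Return {file path: compile error} for files that fail to compile."""
    errors = {}
    for path in pathlib.Path(dags_folder).glob("**/*.py"):
        try:
            py_compile.compile(str(path), doraise=True)
        except py_compile.PyCompileError as exc:
            errors[str(path)] = str(exc)
    return errors

# Placeholder path -- point this at your actual dags folder.
# An empty dict means everything compiled (or the folder is empty/missing).
print(find_broken_dags("dags"))
```

Note this only catches syntax errors, not failing imports; for those, running `python your_dag_file.py` directly in the scheduler's environment is another cheap check.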