Google Cloud Client Library for Python

Overview

Google Cloud Python Client

Python idiomatic clients for Google Cloud Platform services.

Stability levels

The development status classifier on PyPI indicates the current stability of a package.

General Availability

GA (general availability) indicates that the client library for a particular service is stable, and that the code surface will not change in backwards-incompatible ways unless either absolutely necessary (e.g. because of critical security issues) or with an extensive deprecation period. Issues and requests against GA libraries are addressed with the highest priority.

GA libraries have development status classifier Development Status :: 5 - Production/Stable.

Note

Sub-components of GA libraries explicitly marked as beta in the import path (e.g. google.cloud.language_v1beta2) should be considered to be beta.
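
For example, the GA surface of the Natural Language client and its explicitly versioned beta surface live in separate modules; a minimal, illustrative pair of imports:

    # GA surface of the Natural Language API client.
    from google.cloud import language_v1

    # Explicitly versioned beta surface; treat this module as beta even though
    # the google-cloud-language package itself is GA.
    from google.cloud import language_v1beta2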

Beta Support

Beta indicates that the client library for a particular service is mostly stable and is being prepared for release. Issues and requests against beta libraries are addressed with a higher priority.

Beta libraries have development status classifier Development Status :: 4 - Beta.

Alpha Support

Alpha indicates that the client library for a particular service is still a work in progress and is more likely to get backwards-incompatible updates. See the versioning section for more details.

Alpha libraries have development status classifier Development Status :: 3 - Alpha.

If you need support for other Google APIs, check out the Google APIs Python Client library.

Libraries

Client Release Level Version
Asset Inventory ga PyPI-google-cloud-asset
AutoML ga PyPI-google-cloud-automl
BigQuery ga PyPI-google-cloud-bigquery
BigQuery Connection ga PyPI-google-cloud-bigquery-connection
BigQuery Data Transfer Service ga PyPI-google-cloud-bigquery-datatransfer
BigQuery Reservation ga PyPI-google-cloud-bigquery-reservation
BigQuery Storage ga PyPI-google-cloud-bigquery-storage
Bigtable ga PyPI-google-cloud-bigtable
Billing ga PyPI-google-cloud-billing
Build ga PyPI-google-cloud-build
Container Analysis ga PyPI-google-cloud-containeranalysis
Data Catalog ga PyPI-google-cloud-datacatalog
Data Loss Prevention ga PyPI-google-cloud-dlp
Dataproc ga PyPI-google-cloud-dataproc
Datastore ga PyPI-google-cloud-datastore
Firestore ga PyPI-google-cloud-firestore
Identity and Access Management ga PyPI-google-cloud-iam
Internet of Things (IoT) Core ga PyPI-google-cloud-iot
Key Management Service ga PyPI-google-cloud-kms
Kubernetes Engine ga PyPI-google-cloud-container
Logging ga PyPI-google-cloud-logging
Monitoring Dashboards ga PyPI-google-cloud-monitoring-dashboards
NDB Client Library for Datastore ga PyPI-google-cloud-ndb
Natural Language ga PyPI-google-cloud-language
OS Login ga PyPI-google-cloud-os-login
Pub/Sub ga PyPI-google-cloud-pubsub
Recommender API ga PyPI-google-cloud-recommender
Redis ga PyPI-google-cloud-redis
Scheduler ga PyPI-google-cloud-scheduler
Secret Manager ga PyPI-google-cloud-secret-manager
Spanner ga PyPI-google-cloud-spanner
Speech ga PyPI-google-cloud-speech
Stackdriver Monitoring ga PyPI-google-cloud-monitoring
Storage ga PyPI-google-cloud-storage
Tasks ga PyPI-google-cloud-tasks
Text-to-Speech ga PyPI-google-cloud-texttospeech
Trace ga PyPI-google-cloud-trace
Translation ga PyPI-google-cloud-translate
Video Intelligence ga PyPI-google-cloud-videointelligence
Vision ga PyPI-google-cloud-vision
AI Platform Notebooks beta PyPI-google-cloud-notebooks
Access Approval beta PyPI-google-cloud-access-approval
Assured Workloads for Government beta PyPI-google-cloud-assured-workloads
Audit Log beta PyPI-google-cloud-audit-log
Billing Budget beta PyPI-google-cloud-billing-budgets
Binary Authorization beta PyPI-google-cloud-binary-authorization
Channel Services beta PyPI-google-cloud-channel
Data Labeling beta PyPI-google-cloud-datalabeling
Dialogflow CX beta PyPI-google-cloud-dialogflow-cx
Document Understanding API beta PyPI-google-cloud-documentai
Error Reporting beta PyPI-google-cloud-error-reporting
Functions beta PyPI-google-cloud-functions
Game Servers beta PyPI-google-cloud-game-servers
Media Translation beta PyPI-google-cloud-media-translation
Memorystore for Memcached beta PyPI-google-cloud-memcache
Phishing Protection beta PyPI-google-cloud-phishing-protection
Private Certificate Authority beta PyPI-google-cloud-private-ca
Pub/Sub Lite beta PyPI-google-cloud-pubsublite
Recommendations AI beta PyPI-google-cloud-recommendations-ai
Retail API beta PyPI-google-cloud-retail
Runtime Configurator beta PyPI-google-cloud-runtimeconfig
Service Directory beta PyPI-google-cloud-service-directory
Talent Solution beta PyPI-google-cloud-talent
Transcoder beta PyPI-google-cloud-video-transcoder
Workflows beta PyPI-google-cloud-workflows
reCAPTCHA Enterprise beta PyPI-google-cloud-recaptcha-enterprise
Analytics Admin alpha PyPI-google-analytics-admin
Analytics Data API alpha PyPI-google-analytics-data
Area 120 Tables API alpha PyPI-google-area120-tables
Compute Engine alpha PyPI-google-cloud-compute
DNS alpha PyPI-google-cloud-dns
Data QnA alpha PyPI-google-cloud-data-qna
Grafeas alpha PyPI-grafeas
Resource Manager API alpha PyPI-google-cloud-resource-manager
Security Command Center alpha PyPI-google-cloud-securitycenter
Security Scanner alpha PyPI-google-cloud-websecurityscanner
Web Risk alpha PyPI-google-cloud-webrisk

Example Applications

  • getting-started-python - A sample and tutorial that demonstrates how to build a complete web application using Cloud Datastore, Cloud Storage, and Cloud Pub/Sub and deploy it to Google App Engine or Google Compute Engine.
  • google-cloud-python-expenses-demo - A sample expenses demo using Cloud Datastore and Cloud Storage.

Authentication

With google-cloud-python we try to make authentication as painless as possible. Check out the Authentication section in our documentation to learn more. You may also find the authentication document shared by all the google-cloud-* libraries to be helpful.
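
As a quick illustration, most clients can pick up Application Default Credentials automatically or be pointed at a service account key file explicitly (a minimal sketch; the project ID and key path below are placeholders):

    from google.cloud import storage

    # Uses Application Default Credentials, e.g. from the
    # GOOGLE_APPLICATION_CREDENTIALS environment variable or
    # `gcloud auth application-default login`.
    client = storage.Client(project="my-project")

    # Or load credentials explicitly from a service account key file.
    explicit_client = storage.Client.from_service_account_json(
        "/path/to/service-account.json", project="my-project"
    )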

License

Apache 2.0 - See the LICENSE for more information.

Comments
  • Synthesis failed for google-cloud-python

    Hello! Autosynth couldn't regenerate google-cloud-python. :broken_heart:

    Here's the output from running synth.py:

    2021-01-30 05:13:14,117 autosynth [INFO] > logs will be written to: /tmpfs/src/logs/google-cloud-python
    2021-01-30 05:13:19,330 autosynth [DEBUG] > Running: git config --global core.excludesfile /home/kbuilder/.autosynth-gitignore
    2021-01-30 05:13:19,332 autosynth [DEBUG] > Running: git config user.name yoshi-automation
    2021-01-30 05:13:19,335 autosynth [DEBUG] > Running: git config user.email [email protected]
    2021-01-30 05:13:19,337 autosynth [DEBUG] > Running: git config push.default simple
    2021-01-30 05:13:19,340 autosynth [DEBUG] > Running: git branch -f autosynth
    2021-01-30 05:13:19,343 autosynth [DEBUG] > Running: git checkout autosynth
    Switched to branch 'autosynth'
    2021-01-30 05:13:19,350 autosynth [DEBUG] > Running: git rev-parse --show-toplevel
    2021-01-30 05:13:19,353 autosynth [DEBUG] > Running: git log -1 --pretty=%H
    2021-01-30 05:13:19,356 autosynth [DEBUG] > Running: git remote get-url origin
    2021-01-30 05:13:19,359 autosynth [DEBUG] > Running: git checkout e789e3d850b2cbd60816a26a33b3e538b60bdc3d
    Note: checking out 'e789e3d850b2cbd60816a26a33b3e538b60bdc3d'.
    
    You are in 'detached HEAD' state. You can look around, make experimental
    changes and commit them, and you can discard any commits you make in this
    state without impacting any branches by performing another checkout.
    
    If you want to create a new branch to retain commits you create, you may
    do so (now or later) by using -b with the checkout command again. Example:
    
      git checkout -b <new-branch-name>
    
    HEAD is now at e789e3d85 chore: start tracking obsolete files (#10531)
    2021-01-30 05:13:19,363 autosynth [DEBUG] > Running: git branch -f autosynth-0
    2021-01-30 05:13:19,366 autosynth [DEBUG] > Running: git checkout autosynth-0
    Switched to branch 'autosynth-0'
    2021-01-30 05:13:19,369 autosynth [INFO] > Running synthtool
    2021-01-30 05:13:19,370 autosynth [INFO] > ['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--']
    2021-01-30 05:13:19,370 autosynth [DEBUG] > log_file_path: /tmpfs/src/logs/google-cloud-python/0/sponge_log.log
    2021-01-30 05:13:19,371 autosynth [DEBUG] > Running: /tmpfs/src/github/synthtool/env/bin/python3 -m synthtool --metadata synth.metadata synth.py --
    2021-01-30 05:13:19,590 synthtool [DEBUG] > Executing /home/kbuilder/.cache/synthtool/google-cloud-python/synth.py.
    On branch autosynth-0
    nothing to commit, working tree clean
    Traceback (most recent call last):
      File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
        "__main__", mod_spec)
      File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
        exec(code, run_globals)
      File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 102, in <module>
        main()
      File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 829, in __call__
        return self.main(*args, **kwargs)
      File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 782, in main
        rv = self.invoke(ctx)
      File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
        return ctx.invoke(self.callback, **ctx.params)
      File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 610, in invoke
        return callback(*args, **kwargs)
      File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 94, in main
        spec.loader.exec_module(synth_module)  # type: ignore
      File "<frozen importlib._bootstrap_external>", line 678, in exec_module
      File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
      File "/home/kbuilder/.cache/synthtool/google-cloud-python/synth.py", line 149, in <module>
        clients = sorted(all_clients())
      File "/home/kbuilder/.cache/synthtool/google-cloud-python/synth.py", line 142, in all_clients
        for repo in response.json()["repos"]
      File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/requests/models.py", line 900, in json
        return complexjson.loads(self.text, **kwargs)
      File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/json/__init__.py", line 354, in loads
        return _default_decoder.decode(s)
      File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/json/decoder.py", line 342, in decode
        raise JSONDecodeError("Extra data", s, end)
    json.decoder.JSONDecodeError: Extra data: line 1 column 4 (char 3)
    2021-01-30 05:13:19,907 autosynth [ERROR] > Synthesis failed
    2021-01-30 05:13:19,907 autosynth [DEBUG] > Running: git reset --hard HEAD
    HEAD is now at e789e3d85 chore: start tracking obsolete files (#10531)
    2021-01-30 05:13:19,912 autosynth [DEBUG] > Running: git checkout autosynth
    Switched to branch 'autosynth'
    2021-01-30 05:13:19,917 autosynth [DEBUG] > Running: git clean -fdx
    Removing __pycache__/
    Traceback (most recent call last):
      File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
        "__main__", mod_spec)
      File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
        exec(code, run_globals)
      File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 354, in <module>
        main()
      File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 189, in main
        return _inner_main(temp_dir)
      File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 334, in _inner_main
        commit_count = synthesize_loop(x, multiple_prs, change_pusher, synthesizer)
      File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 65, in synthesize_loop
        has_changes = toolbox.synthesize_version_in_new_branch(synthesizer, youngest)
      File "/tmpfs/src/github/synthtool/autosynth/synth_toolbox.py", line 259, in synthesize_version_in_new_branch
        synthesizer.synthesize(synth_log_path, self.environ)
      File "/tmpfs/src/github/synthtool/autosynth/synthesizer.py", line 120, in synthesize
        synth_proc.check_returncode()  # Raise an exception.
      File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 389, in check_returncode
        self.stderr)
    subprocess.CalledProcessError: Command '['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--']' returned non-zero exit status 1.
    
    

    Google internal developers can see the full log here.

    type: bug priority: p2 autosynth failure 
    opened by yoshi-automation 152
  • Support Cloud Client Libraries on Google App Engine standard

    I'm able to deploy it but I keep getting this error: "DistributionNotFound: The 'gcloud' distribution was not found and is required by the application"

    type: bug packaging status: blocked grpc priority: p2 :rotating_light: 
    opened by gpopovic 130
  • Build out some integration with Cloud Bigtable

    We want to make users of Happybase "just work", so...some ideas:

    Option 1: A wrapped import module

    from gcloud.bigtable import happybase
    

    Option 2: Some sort of monkey patching?

    from gcloud import bigtable
    import happybase
    happybase = bigtable.monkey_patch(happybase)
    
    api: bigtable 
    opened by jgeewax 95
  • httplib2.Http is not thread-safe

    This is from a StackOverflow post. I've done some debugging from the Datastore side, but I don't think these requests ever make it into the Datastore part of the stack. I'd appreciate it if someone from the gcloud-python team could look at this. Note that this issue was originally reported via [email protected] in July.

    I have a Python Django application running on a Google Compute instance. It is using gcloudoem to interface from Django to Google Datastore. gcloudoem uses the same underlying code to communicate with Datastore as gcloud-python 0.5.x

    At what seems to be completely random times, I will get SSL errors happening when trying to talk to Datastore. There is no pattern in where in my application code these happen. It's just during a random call to Datastore. Here are the two flavours of errors:

    ERROR:django.request:Internal Server Error: /complete/google-oauth2/
    Traceback (most recent call last):
      File "/usr/local/lib/python2.7/dist-packages/django/core/handlers/base.py", line 111, in get_response
        response = wrapped_callback(request, *callback_args, **callback_kwargs)
      File "/usr/local/lib/python2.7/dist-packages/django/views/decorators/cache.py", line 52, in _wrapped_view_func
        response = view_func(request, *args, **kwargs)
      File "/usr/local/lib/python2.7/dist-packages/django/views/decorators/csrf.py", line 57, in wrapped_view
        return view_func(*args, **kwargs)
      File "/usr/local/lib/python2.7/dist-packages/social/apps/django_app/utils.py", line 51, in wrapper
        return func(request, backend, *args, **kwargs)
      File "/usr/local/lib/python2.7/dist-packages/social/apps/django_app/views.py", line 28, in complete
        redirect_name=REDIRECT_FIELD_NAME, *args, **kwargs)
      File "/usr/local/lib/python2.7/dist-packages/social/actions.py", line 43, in do_complete
        user = backend.complete(user=user, *args, **kwargs)
      File "/usr/local/lib/python2.7/dist-packages/social/backends/base.py", line 41, in complete
        return self.auth_complete(*args, **kwargs)
      File "/usr/local/lib/python2.7/dist-packages/social/utils.py", line 229, in wrapper
        return func(*args, **kwargs)
      File "/usr/local/lib/python2.7/dist-packages/social/backends/oauth.py", line 387, in auth_complete
        *args, **kwargs)
      File "/usr/local/lib/python2.7/dist-packages/social/utils.py", line 229, in wrapper
        return func(*args, **kwargs)
      File "/usr/local/lib/python2.7/dist-packages/social/backends/oauth.py", line 396, in do_auth
        return self.strategy.authenticate(*args, **kwargs)
      File "/usr/local/lib/python2.7/dist-packages/social/strategies/django_strategy.py", line 96, in authenticate
        return authenticate(*args, **kwargs)
      File "/usr/local/lib/python2.7/dist-packages/django/contrib/auth/__init__.py", line 60, in authenticate
        user = backend.authenticate(**credentials)
      File "/usr/local/lib/python2.7/dist-packages/social/backends/base.py", line 82, in authenticate
        return self.pipeline(pipeline, *args, **kwargs)
      File "/usr/local/lib/python2.7/dist-packages/social/backends/base.py", line 85, in pipeline
        out = self.run_pipeline(pipeline, pipeline_index, *args, **kwargs)
      File "/usr/local/lib/python2.7/dist-packages/social/backends/base.py", line 112, in run_pipeline
        result = func(*args, **out) or {}
      File "/usr/local/lib/python2.7/dist-packages/social/pipeline/social_auth.py", line 20, in social_user
        social = backend.strategy.storage.user.get_social_auth(provider, uid)
      File "./social_gc/storage.py", line 105, in get_social_auth
        return cls.objects.get(provider=provider, uid=uid)
      File "/usr/local/lib/python2.7/dist-packages/gcloudoem/queryset/__init__.py", line 162, in get
        num = len(clone)
      File "/usr/local/lib/python2.7/dist-packages/gcloudoem/queryset/__init__.py", line 126, in __len__
        self._fetch_all()
      File "/usr/local/lib/python2.7/dist-packages/gcloudoem/queryset/__init__.py", line 370, in _fetch_all
        self._result_cache = list(self.iterator())
      File "/usr/local/lib/python2.7/dist-packages/gcloudoem/datastore/query.py", line 480, in __iter__
        self.next_page()
      File "/usr/local/lib/python2.7/dist-packages/gcloudoem/datastore/query.py", line 452, in next_page
        transaction_id=transaction and transaction.id,
      File "/usr/local/lib/python2.7/dist-packages/gcloudoem/datastore/connection.py", line 249, in run_query
        response = self._rpc('runQuery', request, datastore_pb.RunQueryResponse)
      File "/usr/local/lib/python2.7/dist-packages/gcloudoem/datastore/connection.py", line 159, in _rpc
        data=request_pb.SerializeToString()
      File "/usr/local/lib/python2.7/dist-packages/gcloudoem/datastore/connection.py", line 134, in _request
        body=data
      File "/usr/local/lib/python2.7/dist-packages/oauth2client/client.py", line 589, in new_request
        redirections, connection_type)
      File "/usr/local/lib/python2.7/dist-packages/httplib2/__init__.py", line 1609, in request
        (response, content) = self._request(conn, authority, uri, request_uri, method, body, headers, redirections, cachekey)
      File "/usr/local/lib/python2.7/dist-packages/httplib2/__init__.py", line 1351, in _request
        (response, content) = self._conn_request(conn, request_uri, method, body, headers)
      File "/usr/local/lib/python2.7/dist-packages/httplib2/__init__.py", line 1307, in _conn_request
        response = conn.getresponse()
      File "/usr/lib/python2.7/httplib.py", line 1127, in getresponse
        response.begin()
      File "/usr/lib/python2.7/httplib.py", line 453, in begin
        version, status, reason = self._read_status()
      File "/usr/lib/python2.7/httplib.py", line 409, in _read_status
        line = self.fp.readline(_MAXLINE + 1)
      File "/usr/lib/python2.7/socket.py", line 480, in readline
        data = self._sock.recv(self._rbufsize)
      File "/usr/lib/python2.7/ssl.py", line 734, in recv
        return self.read(buflen)
      File "/usr/lib/python2.7/ssl.py", line 621, in read
        v = self._sslobj.read(len or 1024)
    SSLError: [SSL: WRONG_VERSION_NUMBER] wrong version number (_ssl.c:1752)
    

    Unfortunately, for the second, I don't have a full stacktrace handy:

    [SSL: DECRYPTION_FAILED_OR_BAD_RECORD_MAC] decryption failed or bad record mac (_ssl.c:1752)
    

    These errors don't happen when I am using the GCD tool. Does anyone have any idea what is happening here? Is this some sort of networking problem?
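
    One common mitigation for httplib2's lack of thread safety is to give each thread its own Http instance rather than sharing one connection object (an illustrative sketch, not code from the affected application):

    import threading
    import httplib2

    # httplib2.Http objects are not safe to share between threads, so hand
    # each thread its own instance via thread-local storage.
    _local = threading.local()

    def get_http():
        if not hasattr(_local, "http"):
            _local.http = httplib2.Http()
        return _local.http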

    api: core 
    opened by pcostell 83
  • Can't import google.protobuf.timestamp_pb2 from App Engine

    I'm getting ImportError: No module named protobuf from the line from google.protobuf import timestamp_pb2 because the google module refers to the AppEngine SDK.

    This is a well-known issue that's been around for years. Maybe you can bundle protobuf directly inside gcloud-python instead of an external dependency?

    packaging 
    opened by barakcoh 77
  • [Discussion] HTTP library

    Presently, we use httplib2 by default but allow users to specify their own http library (#908) so long as it conforms to httplib2.Http's signature.

    httplib2 was chosen because it's the underlying http client used by google/oauth2client. However, httplib2 has a variety of issues include not being thread-safe (#1274), not doing any connection pooling, etc.

    We should consider what it would take to move to another http library. The major considerations are:

    1. We must support using oauth2client's credentials with the library. The essential functionality is adding the auth header and performing refresh & retry. This can be done either here or within oauth2client.
    2. The library must work on Google App Engine.
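
    For reference, whatever library is chosen would still need to be reachable through the httplib2-style interface the clients accept today; a hypothetical adapter only has to expose a compatible request() method (an illustrative sketch, not an existing class):

    class HttpLikeAdapter(object):
        """Hypothetical wrapper presenting the httplib2.Http signature over
        some other transport (requests, urllib3, ...)."""

        def request(self, uri, method="GET", body=None, headers=None,
                    redirections=5, connection_type=None):
            # Delegate to the underlying transport here and return the
            # (response_info, content) pair that httplib2 callers expect.
            raise NotImplementedError
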
    api: core 
    opened by theacodes 62
  • Frequent gRPC StatusCode.UNAVAILABLE errors

    Using the current codebase from master branch (e1fbb6b), with GRPC, we sometimes (0.5% of requests, approximately) see the following exception:

     AbortionError(code=StatusCode.UNAVAILABLE, details="{"created":"@1478255129.468798425","description":"Secure read failed","file":"src/core/lib/security/transport/secure_endpoint.c","file_line":157,"grpc_status":14,"referenced_errors":[{"created":"@1478255129.468756939","description":"EOF","file":"src/core/lib/iomgr/tcp_posix.c","file_line":235}]}"))
    

    Retrying this seems to always succeed.

    Should application code have to care about this kind of error and retry? Or is this a bug in google-cloud-pubsub code?
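
    For context, application-level retries for transient UNAVAILABLE responses can be expressed with google.api_core.retry in newer library versions (an illustrative sketch; the wrapped publish call and backoff values are assumptions, not part of this report):

    from google.api_core import exceptions, retry

    # Retry only the transient "service unavailable" condition, with
    # exponential backoff and an overall deadline.
    transient_retry = retry.Retry(
        predicate=retry.if_exception_type(exceptions.ServiceUnavailable),
        initial=0.5,
        maximum=10.0,
        deadline=60.0,
    )

    def publish_with_retry(publisher, topic_path, payload):
        # Re-invokes the underlying call whenever it raises ServiceUnavailable.
        return transient_retry(publisher.publish)(topic_path, payload)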

    Package versions installed:

    gapic-google-logging-v2==0.10.1
    gapic-google-pubsub-v1==0.10.1
    google-api-python-client==1.5.4
    google-cloud==0.20.0
    google-cloud-bigquery==0.20.0
    google-cloud-bigtable==0.20.0
    google-cloud-core==0.20.0
    google-cloud-datastore==0.20.1
    google-cloud-dns==0.20.0
    google-cloud-error-reporting==0.20.0
    google-cloud-language==0.20.0
    google-cloud-logging==0.20.0
    google-cloud-monitoring==0.20.0
    google-cloud-pubsub==0.20.0
    google-cloud-resource-manager==0.20.0
    google-cloud-storage==0.20.0
    google-cloud-translate==0.20.0
    google-cloud-vision==0.20.0
    google-gax==0.14.1
    googleapis-common-protos==1.3.5
    grpc-google-iam-v1==0.10.1
    grpc-google-logging-v2==0.10.1
    grpc-google-pubsub-v1==0.10.1
    grpcio==1.0.0
    

    Note: Everything google-cloud* comes from git master.

    This is on Python 2.7.3

    Traceback:

      File "ospdatasubmit/pubsub.py", line 308, in _flush
        publish_response = self.pubsub_client.Publish(publish_request, self._publish_timeout)
      File "grpc/beta/_client_adaptations.py", line 305, in __call__
        self._request_serializer, self._response_deserializer)
      File "grpc/beta/_client_adaptations.py", line 203, in _blocking_unary_unary
        raise _abortion_error(rpc_error_call)
    
    type: bug grpc priority: p2 
    opened by forsberg 61
  • Explore going the route of using Clients?

    After our last talk, there have been quite a few different ideas tossed around to make it clear and obvious which credentials and project IDs are in use during a particular API call, some of those have been....

    1. Changing the default values globally
    2. Making "Connections" a context manager (with connection: # do something)
    3. Creating the concept of a client

    We do (1), are talking about doing (2), while the others tend to do (3) -- and comparing the code, I think (3) is the nicest.

    gcloud-node:

    var gcloud = require('gcloud');
    var project1_storage = gcloud.storage({projectId: 'project1', keyFilename: '/path/to/key'});
    var project2_storage_auto = gcloud.storage();
    
    # Do things with the two "clients"...
    bucket1 = project1_storage.get_bucket('bucket-1')
    bucket2 = project2_storage_auto.get_bucket('bucket2')
    

    gcloud-ruby

    require 'gcloud/storage'
    project1_storage = Gcloud.storage "project-id-1", "/path/to/key"
    project2_storage_auto = Gcloud.storage  # Magically figure out the project ID and credentials
    
    # Do things with the two "clients"...
    bucket1 = project1_storage.find_bucket "bucket-1"
    bucket2 = project2_storage_auto.find_bucket "bucket-2"
    

    gcloud-python

    from gcloud import storage
    from gcloud.credentials import get_for_service_account_json
    
    # Create two different credentials.
    credentials1 = get_for_service_account_json('key1.json')
    credentials2 = get_for_service_account_json('key2.json')
    
    # Create two different connections.
    connection1 = storage.Connection(credentials=credentials1)
    connection2 = storage.Connection(credentials=credentials2)
    
    # Get two different buckets
    bucket1 = storage.get_bucket('bucket-1', project='project1', connection=connection1)
    bucket2 = storage.get_bucket('bucket-2', project='project2', connection=connection2)
    

    gcloud-python if we followed the client pattern:

    from gcloud import storage
    
    project1_storage = storage.Client('project1', '/path/to/key')
    project2_storage_auto = storage.Client()
    
    # Do things with the two "clients"...
    bucket1 = project1_storage.get_bucket('bucket-1')
    bucket2 = project2_storage_auto.get_bucket('bucket-2')
    

    Another option for gcloud-python using the client pattern:

    import gcloud
    
    project1_storage = gcloud.storage('project1', '/path/to/key')
    project2_storage_auto = gcloud.storage()
    
    # Do things with the two "clients"
    bucket1 = project1_storage.get_bucket('bucket-1')
    bucket2 = project2_storage_auto.get_bucket('bucket-2')
    

    /cc @dhermes @tseaver

    type: question api: core auth 
    opened by jgeewax 60
  • Pub/Sub Subscriber does not catch & retry UNAVAILABLE errors

    A basic Pub/Sub message consumer stops consuming messages after a retryable error (see stack trace below, but in short _Rendezvous: <_Rendezvous of RPC that terminated with (StatusCode.UNAVAILABLE, The service was unable to fulfill your request. Please try again. [code=8a75])>). The app does not crash, but the stream never recovers to continue receiving messages. Interesting observations:

    • If I simply turn off WiFi on my laptop and run the same code, it keeps retrying until the machine is connected to the network and functions as expected. This tells me that this is a reaction to the specific StatusCode
    • The exception sometimes happens on startup sometimes mid-stream.

    Expected behavior:

    • The application code would continue retrying to build the streamingPull connection and eventually recover and receive messages.
    • This would be handled and surfaced as a warning, rather than a thread-killing exception.

    This might be the same issue as 2683. This comment, in particular, seems like the solution that I would expect the client library to implement.

    Answers to standard questions:

    1. OS type and version MacOS Sierra 10.12.6
    2. Python version and virtual environment information python --version Python 2.7.10 (running in virtualenv)
    3. google-cloud-python version pip show google-cloud, pip show google-<service> or pip freeze
    $ pip show google-cloud
    Name: google-cloud
    Version: 0.27.0
    pip show google-cloud-pubsub
    Name: google-cloud-pubsub
    Version: 0.28.4
    
    4. Stacktrace if available
    Exception in thread Consumer helper: consume bidirectional stream:
    Traceback (most recent call last):
      File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/threading.py", line 810, in __bootstrap_inner
        self.run()
      File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/threading.py", line 763, in run
        self.__target(*self.__args, **self.__kwargs)
      File "/Users/kir/cloud/env/lib/python2.7/site-packages/google/cloud/pubsub_v1/subscriber/_consumer.py", line 248, in _blocking_consume
        self._policy.on_exception(exc)
      File "/Users/kir/cloud/env/lib/python2.7/site-packages/google/cloud/pubsub_v1/subscriber/policy/thread.py", line 140, in on_exception
        raise exception
    _Rendezvous: <_Rendezvous of RPC that terminated with (StatusCode.UNAVAILABLE, The service was unable to fulfill your request. Please try again. [code=8a75])>
    
    5. Steps to reproduce
    • I was not able to reproduce this consistently. But it would happen ~1 in 10 times I ran the code.
    6. Code example
    import time, datetime, sys
    from google.cloud import pubsub_v1 as pubsub
    
    subscription_name = "projects/%s/subscriptions/%s"%(sys.argv[1], sys.argv[2])
    sleep_time_ms = 0
    try:
        sleep_time_ms = int(sys.argv[3])
    except Exception:
        print "Could not parse custom sleep time."
    print "Using sleep time %g ms"%sleep_time_ms
    
    def callback(message):
        t = time.time()
        time.sleep(float(sleep_time_ms)/1000)
        print "Message " + message.data + " acked in %g second"%(time.time() - t)
        message.ack()
    
    subscription = pubsub.SubscriberClient().subscribe(subscription_name).open(callback=callback)
    time.sleep(10000)
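
    With the newer subscriber API, where subscribe() returns a streaming pull future, the retry-and-recover behaviour requested above can be approximated at the application level (a sketch under that assumption, not a fix shipped in the library):

    from google.cloud import pubsub_v1

    def consume_forever(project, subscription, callback):
        subscriber = pubsub_v1.SubscriberClient()
        path = subscriber.subscription_path(project, subscription)
        while True:
            future = subscriber.subscribe(path, callback=callback)
            try:
                future.result()  # blocks until the streaming pull terminates
            except Exception as exc:
                # Drop the broken stream and open a fresh one instead of
                # letting the error kill the consumer thread.
                future.cancel()
                print("stream failed (%s); re-subscribing" % (exc,))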
    
    type: bug api: pubsub priority: p1 
    opened by kir-titievsky 59
  • AttributeError: '_NamespacePath' object has no attribute 'sort'

    1. Ubuntu 16.04.1 LTS
    2. Python 3.5.2 :: Anaconda 4.2.0 (64-bit)

    I was trying to install with pip install --upgrade google-cloud but I got the following AttributeError: '_NamespacePath' object has no attribute 'sort'

    Below is the stack trace:

    Traceback (most recent call last):
      File "/home/ubuntu/anaconda3/bin/pip", line 7, in <module>
        from pip import main
      File "/home/ubuntu/anaconda3/lib/python3.5/site-packages/pip/__init__.py", line 26, in <module>
        from pip.utils import get_installed_distributions, get_prog
      File "/home/ubuntu/anaconda3/lib/python3.5/site-packages/pip/utils/__init__.py", line 27, in <module>
        from pip._vendor import pkg_resources
      File "/home/ubuntu/anaconda3/lib/python3.5/site-packages/pip/_vendor/pkg_resources/__init__.py", line 3018, in <module>
        @_call_aside
      File "/home/ubuntu/anaconda3/lib/python3.5/site-packages/pip/_vendor/pkg_resources/__init__.py", line 3004, in _call_aside
        f(*args, **kwargs)
      File "/home/ubuntu/anaconda3/lib/python3.5/site-packages/pip/_vendor/pkg_resources/__init__.py", line 3046, in _initialize_master_working_set
        dist.activate(replace=False)
      File "/home/ubuntu/anaconda3/lib/python3.5/site-packages/pip/_vendor/pkg_resources/__init__.py", line 2578, in activate
        declare_namespace(pkg)
      File "/home/ubuntu/anaconda3/lib/python3.5/site-packages/pip/_vendor/pkg_resources/__init__.py", line 2152, in declare_namespace
        _handle_ns(packageName, path_item)
      File "/home/ubuntu/anaconda3/lib/python3.5/site-packages/pip/_vendor/pkg_resources/__init__.py", line 2092, in _handle_ns
        _rebuild_mod_path(path, packageName, module)
      File "/home/ubuntu/anaconda3/lib/python3.5/site-packages/pip/_vendor/pkg_resources/__init__.py", line 2121, in _rebuild_mod_path
        orig_path.sort(key=position_in_sys_path)
    AttributeError: '_NamespacePath' object has no attribute 'sort'
    
    packaging 
    opened by Rockyyost 53
  • Synthesis failed for cloudbuild

    Hello! Autosynth couldn't regenerate cloudbuild. :broken_heart:

    Here's the output from running synth.py:

    Cloning into 'working_repo'...
    Switched to branch 'autosynth-cloudbuild'
    Running synthtool
    ['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
    synthtool > Executing /tmpfs/src/git/autosynth/working_repo/cloudbuild/synth.py.
    synthtool > Ensuring dependencies.
    synthtool > Pulling artman image.
    latest: Pulling from googleapis/artman
    Digest: sha256:c773192618c608a7a0415dd95282f841f8e6bcdef7dd760a988c93b77a64bd57
    Status: Image is up to date for googleapis/artman:latest
    synthtool > Cloning googleapis.
    synthtool > Running generator for google/devtools/cloudbuild/artman_cloudbuild.yaml.
    synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1.
    synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/devtools/cloudbuild/v1/cloudbuild.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1/google/cloud/cloudbuild_v1/proto/cloudbuild.proto
    synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1/google/cloud/cloudbuild_v1/proto.
    Traceback (most recent call last):
      File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
        "__main__", mod_spec)
      File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
        exec(code, run_globals)
      File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 87, in <module>
        main()
      File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
        return self.main(*args, **kwargs)
      File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
        rv = self.invoke(ctx)
      File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
        return ctx.invoke(self.callback, **ctx.params)
      File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
        return callback(*args, **kwargs)
      File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 79, in main
        spec.loader.exec_module(synth_module)  # type: ignore
      File "<frozen importlib._bootstrap_external>", line 678, in exec_module
      File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
      File "/tmpfs/src/git/autosynth/working_repo/cloudbuild/synth.py", line 42, in <module>
        'README.rst'
      File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 156, in move
        _tracked_paths.relativize(e) for e in _expand_paths(excludes, source)
      File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 156, in <listcomp>
        _tracked_paths.relativize(e) for e in _expand_paths(excludes, source)
      File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 58, in _expand_paths
        for p in root.glob(path)
      File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 57, in <genexpr>
        p
      File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/pathlib.py", line 1080, in glob
        selector = _make_selector(tuple(pattern_parts))
      File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/pathlib.py", line 459, in _make_selector
        raise ValueError("Invalid pattern: '**' can only be an entire path component")
    ValueError: Invalid pattern: '**' can only be an entire path component
    synthtool > Cleaned up 1 temporary directories.
    synthtool > Wrote metadata to synth.metadata.
    
    Synthesis failed
    
    

    Google internal developers can see the full log here.

    type: bug priority: p1 :rotating_light: autosynth failure 
    opened by yoshi-automation 51
  • chore(deps): update all dependencies

    Mend Renovate

    This PR contains the following updates:

    | Package | Change | Age | Adoption | Passing | Confidence | Type | Update |
    |---|---|---|---|---|---|---|---|
    | google-api-core | >= 1.34.0, <3.0.0dev,!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,!=2.8.*,!=2.9.*,!=2.10.* -> >=2.11, <3.0.0dev, !=2.0, !=2.1, !=2.2, !=2.3, !=2.4, !=2.5, !=2.6, !=2.7, !=2.8, !=2.9, !=2.10 | age | adoption | passing | confidence | | major |
    | python | 3.11.0-buster -> 3.11.1-buster | age | adoption | passing | confidence | final | patch |


    Release Notes

    googleapis/python-api-core

    v2.11.0

    Compare Source

    Features
    Bug Fixes

    v2.10.2

    Compare Source

    Bug Fixes

    v2.10.1

    Compare Source

    Bug Fixes

    v2.10.0

    Compare Source

    Features

    v2.9.0

    Compare Source

    Features
    Bug Fixes

    v2.8.2

    Compare Source

    Bug Fixes
    Documentation

    v2.8.1

    Compare Source

    Bug Fixes

    v2.8.0

    Compare Source

    Features

    v2.7.3

    Compare Source

    Bug Fixes

    v2.7.2

    Compare Source

    Bug Fixes

    v2.7.1

    Compare Source

    Bug Fixes

    v2.7.0

    Compare Source

    Features

    v2.6.1

    Compare Source

    Bug Fixes

    v2.6.0

    Compare Source

    Features

    v2.5.0

    Compare Source

    Features
    Bug Fixes
    Documentation

    v2.4.0

    Compare Source

    Features

    v2.3.2

    Compare Source

    Bug Fixes
    • address broken wheels in version 2.3.1

    v2.3.1

    Compare Source

    Bug Fixes
    • exclude function target from retry deadline exceeded exception message (#​318) (34ebdcc)

    v2.3.0

    Compare Source

    Features
    Bug Fixes

    v2.2.2

    Compare Source

    Bug Fixes

    v2.2.1

    Compare Source

    Bug Fixes

    v2.2.0

    Compare Source

    Features

    v2.1.1

    Compare Source

    Bug Fixes

    v2.1.0

    Compare Source

    Features

    v2.0.1

    Compare Source

    Bug Fixes

    v2.0.0

    Compare Source

    ⚠ BREAKING CHANGES
    Bug Fixes

    Configuration

    📅 Schedule: Branch creation - "before 3am on Monday" (UTC), Automerge - At any time (no schedule defined).

    🚦 Automerge: Disabled by config. Please merge this manually once you are satisfied.

    ♻ Rebasing: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.

    👻 Immortal: This PR will be recreated if closed unmerged. Get config help if that's undesired.


    • [ ] If you want to rebase/retry this PR, check this box

    This PR has been generated by Mend Renovate. View repository job log here.

    kokoro:force-run 
    opened by renovate-bot 0
  • use Search repositories endpoint

    opened by av1m 0
  • support SQL admin APIs

    Hi,

    We want to use the sqladmin APIs as described here. But they are only supported by the deprecated client, and they strongly recommend using this client. Are there any plans on adding those APIs?

    Thanks

    opened by ohadgranica 0
  • Too many libraries with too similar names

    PLEASE READ: If you have a support contract with Google, please create an issue in the support console instead of filing on GitHub. This will ensure a timely response.

    NOTE: Google Cloud Python client libraries are no longer maintained inside this repository. Please visit the python-API repository (e.g., https://github.com/googleapis/python-pubsub/issues) for faster response times.

    See all published libraries in the README.

    I only figured this out while rummaging around in site-packages after a ModuleNotFoundError on the very first try. You need a better naming convention. More prominently displayed version numbers, side by side with effective and deprecation dates, would save hours and hours of dev time trying to parse which python / cloud / google / api is the right one - time that is not spent making apps. Don't use those 4 words anymore. Give them clearly distinguished or unique version names, as Ubuntu and Android have done for years.

    https://github.com/googleapis/google-api-python-client https://github.com/googleapis/google-cloud-python https://developers.google.com/drive/api/quickstart/python

    opened by MalikRumi 0
  • Update README to include package names, currently only the `pretty name` is included in the list

    The title should include both the package name and the pretty name

    https://github.com/googleapis/google-cloud-python/blob/c2da566a711b7b354346c901c49a83f1c49f570f/scripts/updateapilist.py#L36

    opened by parthea 0
  • stability level nomenclature inconsistent in README

    In the README.rst file the text talks about "GA", "alpha", and "beta", but the table has "stable" and "preview". Let's be consistent.

    priority: p2 type: docs 
    opened by vchudnov-g 0
Releases (google-maps-routing-v0.1.2)
  • google-maps-routing-v0.1.2(Dec 14, 2022)

  • google-maps-routing-v0.1.1(Dec 6, 2022)

  • google-maps-addressvalidation-v0.1.1(Dec 6, 2022)

  • google-geo-type-v0.2.1(Dec 6, 2022)

  • google-cloud-vmwareengine-v0.1.1(Dec 6, 2022)

  • google-cloud-enterpriseknowledgegraph-v0.2.1(Dec 6, 2022)

  • google-cloud-discoveryengine-v0.2.1(Dec 6, 2022)

  • google-cloud-contentwarehouse-v0.2.1(Dec 6, 2022)

  • google-apps-script-type-v0.2.1(Dec 6, 2022)

  • google-cloud-vmwareengine-v0.1.0(Nov 18, 2022)

  • google-apps-script-type-v0.2.0(Nov 14, 2022)

  • google-maps-routing-v0.1.0(Nov 10, 2022)

  • google-maps-addressvalidation-v0.1.0(Nov 10, 2022)

  • google-geo-type-v0.2.0(Nov 10, 2022)

  • google-cloud-enterpriseknowledgegraph-v0.2.0(Nov 10, 2022)

  • google-cloud-discoveryengine-v0.2.0(Nov 10, 2022)

  • google-cloud-contentwarehouse-v0.2.0(Nov 10, 2022)

  • google-apps-script-type-v0.1.0(Nov 10, 2022)

  • google-geo-type-v0.1.0(Nov 9, 2022)

  • google-cloud-contentwarehouse-v0.1.0(Nov 9, 2022)

  • google-cloud-enterpriseknowledgegraph-v0.1.0(Nov 2, 2022)

  • google-cloud-discoveryengine-v0.1.1(Nov 2, 2022)

  • bigquery-1.24.0(Feb 3, 2020)

    02-03-2020 01:38 PST

    Implementation Changes

    • Fix inserting missing repeated fields. (#10196)
    • Deprecate client.dataset() in favor of DatasetReference. (#7753)
    • Use faster to_arrow + to_pandas in to_dataframe() when pyarrow is available. (#10027)
    • Write pandas datetime[ns] columns to BigQuery TIMESTAMP columns. (#10028)

    New Features

    • Check rows argument type in insert_rows(). (#10174)
    • Check json_rows arg type in insert_rows_json(). (#10162)
    • Make RowIterator.to_dataframe_iterable() method public. (#10017)
    • Add retry parameter to public methods where missing. (#10026)
    • Add timeout parameter to Client and Job public methods. (#10002)
    • Add timeout parameter to QueryJob.done() method. (#9875)
    • Add create_bqstorage_client parameter to to_dataframe() and to_arrow() methods. (#9573)

    Dependencies

    • Fix minimum versions of google-cloud-core and google-resumable-media dependencies. (#10016)

    Documentation

    • Fix a comment typo in job.py. (#10209)
    • Update code samples of load table file and load table URI. (#10175)
    • Uncomment Client constructor and imports in samples. (#10058)
    • Remove unused query code sample. (#10024)
    • Update code samples to use strings for table and dataset IDs. (#9974)

    Internal / Testing Changes

    • Bump copyright year to 2020, tweak docstring formatting (via synth). #10225
    • Add tests for concatenating categorical columns. (#10180)
    • Adjust test assertions to the new default timeout. (#10222)
    • Use Python 3.6 for the nox blacken session (via synth). (#10012)
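
    For example, the new create_bqstorage_client parameter listed above is passed directly to the DataFrame conversion; a minimal sketch (the query and project ID are placeholders, and pandas plus google-cloud-bigquery-storage need to be installed):

    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")
    job = client.query("SELECT 1 AS x")

    # Downloads the result set via the BigQuery Storage API when available.
    df = job.result().to_dataframe(create_bqstorage_client=True)
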
  • core-1.3.0(Jan 31, 2020)

  • asset-0.7.0(Jan 29, 2020)

  • recommender-0.2.0(Jan 24, 2020)

    01-24-2020 14:03 PST

    Implementation Changes

    • Deprecate resource name helper methods (via synth). (#9863)

    New Features

    • Add v1, set release level to beta. (#10170)

    Documentation

    • Add Python 2 sunset banner to documentation. (#9036)
    • Change requests intersphinx url (via synth). (#9408)
    • Fix library reference doc link. (#9338)

    Internal / Testing Changes

    • Correct config path in synth file for recommender. (#10076)
  • storage-1.25.0(Jan 16, 2020)

    01-16-2020 11:00 PST

    Implementation Changes

    • fix: replace unsafe six.PY3 with PY2 for better future compatibility with Python 4 (#10081)
    • fix(storage): fix document of delete blob (#10015)

    New Features

    • feat(storage): support optionsRequestedPolicyVersion (#9989)

    Dependencies

    • chore(storage): bump core dependency to 1.2.0 (#10160)
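
    The optionsRequestedPolicyVersion support surfaces as a requested_policy_version argument when fetching a bucket's IAM policy (a sketch; the bucket name is a placeholder):

    from google.cloud import storage

    client = storage.Client()
    bucket = client.bucket("my-bucket")

    # Request policy version 3 so conditional role bindings can be returned.
    policy = bucket.get_iam_policy(requested_policy_version=3)
    print("IAM policy version:", policy.version)
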
  • core-1.2.0(Jan 15, 2020)

  • api_core-1.16.0(Jan 14, 2020)

  • bigtable-1.2.1(Jan 3, 2020)

Owner
Google APIs
Clients for Google APIs and tools that help produce them.