Metadata-Version: 2.1
Name: py2k
Version: 1.8.2
Summary: High level Python API for writing to Kafka
Home-page: https://github.com/AbsaOSS/py2k.git
Author: Daniel Wertheimer
Author-email: daniel.wertheimer@absa.africa
License: Apache Software License
Description: # Welcome to Py2k
        
        [![Tests](https://github.com/AbsaOSS/py2k/actions/workflows/ci.yml/badge.svg)](https://github.com/AbsaOSS/py2k/actions/workflows/ci.yml)
        [![codecov](https://codecov.io/gh/AbsaOSS/py2k/branch/main/graph/badge.svg?token=ICP840115H)](https://codecov.io/gh/AbsaOSS/py2k)
        [![pypi](https://img.shields.io/pypi/v/py2k.svg)](https://pypi.python.org/pypi/py2k)
        [![downloads](https://img.shields.io/pypi/dm/py2k.svg)](https://pypistats.org/packages/py2k)
        [![versions](https://img.shields.io/pypi/pyversions/py2k.svg)](https://github.com/AbsaOSS/py2k)
        [![license](https://img.shields.io/github/license/AbsaOSS/py2k.svg)](https://github.com/AbsaOSS/py2k/blob/main/LICENSE)
        
        A high-level Python to Kafka API with Schema Registry compatibility and automatic Avro schema creation.
        
        - Free software: Apache 2.0 license
        
        ## Installation
        
        Py2K is available on PyPI and can be installed with pip:
        
        ```bash
        pip install py2k
        ```
        
        ## Contributing
        
        Please see the [Contribution Guide](.github/CONTRIBUTING.md) for more information.
        
        ## Usage
        
        ### Minimal Example
        
        ```python
        from py2k.record import PandasToRecordsTransformer
        from py2k.writer import KafkaWriter
        
        # assuming we have a pandas DataFrame, df
        records = PandasToRecordsTransformer(df=df, record_name='test_model').from_pandas()
        
        writer = KafkaWriter(
            topic="topic_name",
            schema_registry_config=schema_registry_config,
            producer_config=producer_config
        )
        
        writer.write(records)
        ```
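        The `schema_registry_config` and `producer_config` dicts above are not defined in the snippet. Py2K builds on confluent-kafka, so a minimal sketch might look like the following; the URL and broker address are placeholders, and the exact set of accepted keys should be checked against your confluent-kafka / Schema Registry setup:
        
        ```python
        # Placeholder configuration dicts; substitute your own endpoints.
        schema_registry_config = {
            'url': 'http://localhost:8081'  # Schema Registry REST endpoint
        }
        
        producer_config = {
            'bootstrap.servers': 'localhost:9092'  # comma-separated Kafka brokers
        }
        ```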
        
        For additional examples, please see the [examples](./examples) folder.
        
        ## Features
        
        - Schema Registry Integration
        - Automatic Avro Serialization from pandas DataFrames
        - Automatic Avro Schema generation from pandas DataFrames and Pydantic objects
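        To give a flavour of what automatic schema generation involves, here is an illustrative sketch of mapping common pandas dtype names to Avro types and building Avro field entries from them. This is a simplified model for explanation only, not py2k's actual implementation, and the mapping table is an assumption:
        
        ```python
        # Illustrative dtype-name -> Avro type mapping; py2k's internals may differ.
        PANDAS_TO_AVRO = {
            'int64': 'long',
            'int32': 'int',
            'float64': 'double',
            'float32': 'float',
            'bool': 'boolean',
            'object': 'string',  # pandas stores strings as object dtype
        }
        
        def avro_fields(dtypes):
            """Build Avro record field entries from a {column: dtype-name} mapping."""
            return [{'name': col, 'type': PANDAS_TO_AVRO.get(dt, 'string')}
                    for col, dt in dtypes.items()]
        ```
        
        For example, `avro_fields({'id': 'int64', 'active': 'bool'})` would yield fields typed `long` and `boolean` respectively.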
        
        ## License
        
            Copyright 2021 ABSA Group Limited
        
            Licensed under the Apache License, Version 2.0 (the "License");
            you may not use this file except in compliance with the License.
            You may obtain a copy of the License at
        
                http://www.apache.org/licenses/LICENSE-2.0
        
            Unless required by applicable law or agreed to in writing, software
            distributed under the License is distributed on an "AS IS" BASIS,
            WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
            See the License for the specific language governing permissions and
            limitations under the License.
        
        
        # Release Notes
        
        ## v1.8.2 (2021-04-06)
        
        ### Bugs
        
        - Resolved boolean schema not being converted to the correct avro schema values PR [#48](https://github.com/AbsaOSS/py2k/pull/48) - [@vesely-david](https://github.com/vesely-david)
        
        ## v1.8.1 (2021-03-31)
        
        ### Docs
        
        - Added examples and fixed the mkdocs github.io page build - [@DanWertheimer](https://github.com/DanWertheimer). PR [#45](https://github.com/AbsaOSS/py2k/pull/45)
        
        ## v1.8.0 (2021-03-29)
        
        ### Fixes
        
        - Adhering to Kafka and Avro parlance by renaming:
          - models module -> record
          - KafkaModel -> KafkaRecord
          - DynamicPandasModel -> PandasToRecordsTransformer
          - item -> record
        - Move schema knowledge to KafkaRecord
        - Introduce `__key_fields__` in KafkaRecord to enable specifying which fields are part of the key
        - Introduce `__include_key__` in KafkaRecord to enable specifying whether key_fields should be part of the value message
        
        Big thank you to [@vesely-david](https://github.com/vesely-david) for this change
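        The semantics described above can be sketched as a plain function (an illustrative model of the behaviour, not py2k's code): `__key_fields__` selects which fields populate the message key, and `__include_key__` controls whether those fields also remain in the value:
        
        ```python
        def split_record(record, key_fields, include_key=True):
            """Split a record dict into (key, value) dicts per the described semantics."""
            key = {f: record[f] for f in key_fields}
            if include_key:
                value = dict(record)  # key fields stay in the value message
            else:
                value = {f: v for f, v in record.items() if f not in key_fields}
            return key, value
        ```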
        
        ## v1.7.0 (2021-03-11)
        
        - Minor API change for easier dynamic creation of KafkaModels from a pandas DataFrame
        
        ## v1.6.0 (2021-03-01)
        
        - First commit on Github.
        
Keywords: py2k
Platform: UNKNOWN
Classifier: Development Status :: 5 - Production/Stable
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: Apache Software License
Classifier: Natural Language :: English
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.6
Classifier: Programming Language :: Python :: 3.7
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Requires-Python: >=3.6
Description-Content-Type: text/markdown
