Metadata-Version: 2.1
Name: tfrecords
Version: 0.0.8
Summary: tfrecords: simplify and transplant the tfrecords
Home-page: https://github.com/ssbuild/tfrecords
Author: ssbuild
Author-email: 9727464@qq.com
License: Apache 2.0
Keywords: tfrecords,tfrecord,records,datasets
Platform: win32_AMD64
Classifier: Development Status :: 5 - Production/Stable
Classifier: Intended Audience :: Developers
Classifier: Intended Audience :: Education
Classifier: Intended Audience :: Science/Research
Classifier: License :: OSI Approved :: Apache Software License
Classifier: Programming Language :: C++
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.6
Classifier: Programming Language :: Python :: 3.7
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Topic :: Scientific/Engineering
Classifier: Topic :: Scientific/Engineering :: Mathematics
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Classifier: Topic :: Software Development
Classifier: Topic :: Software Development :: Libraries
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Requires-Python: >=3, <4
Description-Content-Type: text/markdown

# tfrecords
## a simplified, standalone port of the tfrecord dataset format

### update information
```text
    2022-10-17:  Added a shared-memory read mode for faster reading.
    2022-02-01:  Initial simplified, standalone port of the tfrecord dataset.
```

### Record read/write demo. Enabling the `with_share_memory` flag is recommended and greatly accelerates file reading.

```python
import tfrecords
options = tfrecords.TFRecordOptions(compression_type=tfrecords.TFRecordCompressionType.NONE)

def test_write(filename, N=3, context='aaa'):
    with tfrecords.TFRecordWriter(filename, options=options) as file_writer:
        for i in range(N):
            file_writer.write(context + '____' + str(i))

def test_record_iterator(example_paths):
    print('test_record_iterator')
    for example_path in example_paths:
        # with_share_memory=True is recommended for faster reading
        iterator = tfrecords.tf_record_iterator(example_path, options=options, with_share_memory=False)
        num = 0
        for record in iterator:
            num += 1
            print(record)

def test_random_reader(example_paths):
    print('test_random_reader')
    for example_path in example_paths:
        file_reader = tfrecords.tf_record_random_reader(example_path, options=options, with_share_memory=False)
        last_pos = 0
        while True:
            try:
                x, pos = file_reader.read(last_pos)
                print(x)
                last_pos = pos
            except Exception:  # raised once the end of the file is reached
                break

def test_random_reader2(example_paths):
    print('test_random_reader2')
    for example_path in example_paths:
        file_reader = tfrecords.tf_record_random_reader(example_path, options=options, with_share_memory=False)
        skip_bytes = 0
        offset_list = file_reader.read_offsets(skip_bytes)
        for offset, length in offset_list:
            x, _ = file_reader.read(offset)
            print(x)

test_write('d:/example.tfrecords0', 3, 'file0')
test_write('d:/example.tfrecords1', 10, 'file1')
test_write('d:/example.tfrecords2', 12, 'file2')
example_paths = tfrecords.glob('d:/example.tfrecords*')


test_record_iterator(example_paths)
print()
test_random_reader(example_paths)
print()
test_random_reader2(example_paths)
print()
```
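For reference, the on-disk TFRecord container format that readers and writers like the above handle is simple: each record is a little-endian uint64 payload length, a masked CRC-32C of those 8 length bytes, the payload itself, and a masked CRC-32C of the payload. The sketch below is a stdlib-only illustration of that wire format, not this library's API; the helper names (`write_record`, `read_records`) are illustrative:

```python
import io
import struct

# CRC-32C (Castagnoli), bitwise reflected implementation (polynomial 0x82F63B78).
def crc32c(data: bytes) -> int:
    crc = 0xFFFFFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ (0x82F63B78 if crc & 1 else 0)
    return crc ^ 0xFFFFFFFF

# TFRecord stores CRCs "masked" (rotate right 15 bits, add a constant)
# so that checksumming data that contains embedded CRCs stays robust.
def masked_crc(data: bytes) -> int:
    crc = crc32c(data)
    return (((crc >> 15) | (crc << 17)) + 0xA282EAD8) & 0xFFFFFFFF

def write_record(stream, payload: bytes) -> None:
    # layout: uint64 length | uint32 masked crc(length) | payload | uint32 masked crc(payload)
    header = struct.pack('<Q', len(payload))
    stream.write(header)
    stream.write(struct.pack('<I', masked_crc(header)))
    stream.write(payload)
    stream.write(struct.pack('<I', masked_crc(payload)))

def read_records(stream):
    while True:
        header = stream.read(8)
        if len(header) < 8:
            return  # clean end of stream
        (length,) = struct.unpack('<Q', header)
        assert struct.unpack('<I', stream.read(4))[0] == masked_crc(header)
        payload = stream.read(length)
        assert struct.unpack('<I', stream.read(4))[0] == masked_crc(payload)
        yield payload

# Round-trip three records through an in-memory buffer.
buf = io.BytesIO()
for i in range(3):
    write_record(buf, b'file0____%d' % i)
buf.seek(0)
records = list(read_records(buf))
print(records)  # [b'file0____0', b'file0____1', b'file0____2']
```

Files produced this way are byte-compatible with uncompressed tfrecord files, which is why independent ports such as this package can read TensorFlow-written records and vice versa.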


