bedapudi6788 / deepcorrect
Text and Punctuation correction with Deep Learning
License: GNU General Public License v3.0
After installing TensorFlow 1.14.0 and deepcorrect, I am running the example, but I get the following error:

1608     except errors.InvalidArgumentError as e:
1609       # Convert to ValueError for backwards compatibility.
-> 1610      raise ValueError(str(e))
1611
1612     return c_op

ValueError: Dimension 0 in both shapes must be equal, but are 2 and 98. Shapes are [2,256] and [98,256]. for 'Assign' (op: 'Assign') with input shapes: [2,256], [98,256].
Hello,
I just wanted to let you know that this module no longer works with TensorFlow 2.0.
I get the following error on import:
AttributeError: module 'tensorflow' has no attribute 'set_random_seed'
The full traceback:
Traceback (most recent call last):
File "<input>", line 1, in <module>
File "/Applications/PyCharm CE with Anaconda plugin.app/Contents/helpers/pydev/_pydev_bundle/pydev_import_hook.py", line 21, in do_import
module = self._system_import(name, *args, **kwargs)
File "/anaconda3/envs/ziotag_metis_env/lib/python3.7/site-packages/deepcorrect/__init__.py", line 1, in <module>
from .deepcorrect import DeepCorrect
File "/Applications/PyCharm CE with Anaconda plugin.app/Contents/helpers/pydev/_pydev_bundle/pydev_import_hook.py", line 21, in do_import
module = self._system_import(name, *args, **kwargs)
File "/anaconda3/envs/ziotag_metis_env/lib/python3.7/site-packages/deepcorrect/deepcorrect.py", line 1, in <module>
from txt2txt import build_model, infer
File "/Applications/PyCharm CE with Anaconda plugin.app/Contents/helpers/pydev/_pydev_bundle/pydev_import_hook.py", line 21, in do_import
module = self._system_import(name, *args, **kwargs)
File "/anaconda3/envs/ziotag_metis_env/lib/python3.7/site-packages/txt2txt/__init__.py", line 3, in <module>
from .txt2txt import *
File "/Applications/PyCharm CE with Anaconda plugin.app/Contents/helpers/pydev/_pydev_bundle/pydev_import_hook.py", line 21, in do_import
module = self._system_import(name, *args, **kwargs)
File "/anaconda3/envs/ziotag_metis_env/lib/python3.7/site-packages/txt2txt/txt2txt.py", line 8, in <module>
tf.set_random_seed(6788)
AttributeError: module 'tensorflow' has no attribute 'set_random_seed'
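For anyone hitting this on TensorFlow 2.x: the 1.x seeding call was removed from the top-level namespace (the 2.x equivalent is `tf.random.set_seed`). A small dispatch helper, my own sketch rather than anything in deepcorrect or txt2txt, can paper over the difference; the fake module below only simulates the 2.x API so the sketch runs without TensorFlow installed:

```python
import types

def set_seed_compat(tf, seed):
    """Call whichever seeding API the installed TensorFlow exposes."""
    if hasattr(tf, "set_random_seed"):                       # TensorFlow 1.x
        tf.set_random_seed(seed)
    elif hasattr(getattr(tf, "random", None), "set_seed"):   # TensorFlow 2.x
        tf.random.set_seed(seed)
    else:
        raise AttributeError("no known TensorFlow seeding API found")

# Stand-in for a TF 2.x-style module, so the sketch runs without TensorFlow:
calls = []
fake_tf2 = types.SimpleNamespace(random=types.SimpleNamespace(set_seed=calls.append))
set_seed_compat(fake_tf2, 6788)
print(calls)  # [6788]
```

Until txt2txt itself is updated, though, the simplest path is pinning `tensorflow<2.0` in the environment.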
https://colab.research.google.com/drive/10EcCIY91VjETzw_IYoLmajyrGvQrzZpB
You can run this code and you will see this error:
OSError Traceback (most recent call last)
<ipython-input-17-96cc841e4fb1> in <module>()
1 from deepcorrect import DeepCorrect
----> 2 corrector = DeepCorrect('/content/sample_data/deeppunct_params_en', '/content/sample_data/deeppunct_checkpoint_wikipedia')
3 corrector.correct('hey')
/usr/local/lib/python3.6/dist-packages/h5py/_hl/files.py in make_fid(name, mode, userblock_size, fapl, fcpl, swmr)
140 if swmr and swmr_support:
141 flags |= h5f.ACC_SWMR_READ
--> 142 fid = h5f.open(name, flags, fapl=fapl)
143 elif mode == 'r+':
144 fid = h5f.open(name, h5f.ACC_RDWR, fapl=fapl)
h5py/_objects.pyx in h5py._objects.with_phil.wrapper()
h5py/_objects.pyx in h5py._objects.with_phil.wrapper()
h5py/h5f.pyx in h5py.h5f.open()
OSError: Unable to open file (truncated file: eof = 6291456, sblock->base_addr = 0, stored_eof = 10912808)
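The `eof` being smaller than `stored_eof` in that message means the checkpoint file on disk is shorter than its HDF5 superblock says it should be, i.e. the download stopped partway; re-downloading the checkpoint usually fixes it. A cheap pre-flight check (a hypothetical helper of mine, not part of deepcorrect) can catch this before h5py does:

```python
import os

HDF5_SIGNATURE = b"\x89HDF\r\n\x1a\n"  # the first 8 bytes of every HDF5 file

def looks_like_hdf5(path, expected_size=None):
    """Sanity-check a downloaded checkpoint before handing it to Keras/h5py."""
    if not os.path.isfile(path):
        return False
    with open(path, "rb") as f:
        if f.read(8) != HDF5_SIGNATURE:
            return False                 # wrong file type (e.g. an HTML error page)
    if expected_size is not None and os.path.getsize(path) != expected_size:
        return False                     # truncated (or padded) download
    return True
```

If this returns False, delete the file and fetch it again rather than retrying DeepCorrect.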
Help!
I would like an update, or a tutorial, for serving this model with TensorFlow Serving.
Hi, I read your post https://medium.com/@praneethbedapudi/deepcorrection-3-spell-correction-and-simple-grammar-correction-d033a52bc11d and I am trying the pretrained model with deeppunct_params_en and deeppunct_checkpoint_google_news / deeppunct_checkpoint_tatoeba_cornell / deeppunct_checkpoint_wikipedia respectively. However, instead of getting:
INPUT: iwill be there four u
OUTPUT: I will be there for you.
I got:
INPUT: iwill be there four u
OUTPUT: Iwill be there four ?
Any suggestions for getting results as good as yours?
Hi, I can't reproduce the same results as the demo using this code.
The system environment is Python 3.7.3, TensorFlow 1.14.0, txt2txt 1.0.9.
I used the parameters below as README.md says; the params and checkpoint were downloaded from https://drive.google.com/open?id=1Yd8cJaqfQkrJMbRVWIWtuyo4obTDYu-e

def __init__(self, params_path, checkpoint_path):
    DeepCorrect.deepcorrect_model = build_model("./deeppunct_params_en")
    DeepCorrect.deepcorrect_model[0].load_weights("./deeppunct_checkpoint_google_news")

I tried two sentences; the result comparison is as follows:
input: hey
my output: [{'sequence': 'Hey?', 'prob': 0.6868892775191712}]
Demo output: "deep-segment_punct": [
"Hey."
]
input: 'Why you did this to me I hate you you are dead to me'
my output: [{'sequence': 'Why you did this to me I hate you you are dead to me.', 'prob': 0.6264143199088837}]
Demo output: "deep-segment_punct": [
"Why you did this to me? I hate you, You are dead to me."
]
Where could the problem be? How can I get the same results as in DeepPunct?
If you need more details to locate the error, please tell me.
Thanks
Hi,
I am using the deeppunct_params_en referenced in README.md, downloaded from https://drive.google.com/drive/folders/1Yd8cJaqfQkrJMbRVWIWtuyo4obTDYu-e, and got this error:
178 if os.path.exists(params_path):
179 print('Loading the params file')
--> 180 params = pickle.load(open(params_path, 'rb'))
181 return params
182
UnpicklingError: invalid load key, 'H'.
Error can be reproduced on Python 3.5, 3.6 and 3.7.
Were any changes introduced in the params file?
Thanks,
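`invalid load key` means the first byte of the file is not a pickle opcode, so the bytes on disk are not the params pickle at all. Two common causes, which are my assumptions rather than anything confirmed in this thread, are passing the checkpoint path where the params path is expected, and Google Drive serving its HTML download-warning page instead of the file. A hypothetical sniffing helper:

```python
def sniff_file(path):
    """Guess what a supposed params pickle actually is, from its first bytes."""
    with open(path, "rb") as f:
        head = f.read(8)
    if head.startswith(b"\x89HDF"):
        return "hdf5"      # likely the Keras checkpoint, not the params pickle
    if head.lstrip().startswith(b"<"):
        return "html"      # likely a Google Drive interstitial page
    return "unknown"
```

Running this on the file that fails to unpickle narrows the problem to a wrong path versus a bad download.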
When I write the same code as mentioned, I get the above-mentioned error.
How do I proceed?
Hello,
This is great! Any chance there could be a Node version?
Happy to help port the Python to Node.js if there are no obvious limitations or blockers, with some guidance on what to look out for.
Hi,
Thanks for this package. I just tried running it with the following code:
corrector = DeepCorrect('./deeppunct_params_en', './deeppunct_checkpoint_wikipedia')
However, I get this error:
`
corrector = DeepCorrect('./deeppunct_params_en', './deeppunct_checkpoint_wikipedia')
Loading the params file
Input encoding {'o': 2, '{': 3, '.': 4, 'J': 5, ... ';': 96, '-': 97}
Input decoding {2: 'o', 3: '{', 4: '.', 5: 'J', ... 96: ';', 97: '-'}
Output encoding {'o': 2, '{': 3, '.': 4, 'J': 5, ... ';': 96, '-': 97}
Output decoding {2: 'o', 3: '{', 4: '.', 5: 'J', ... 96: ';', 97: '-'}
Traceback (most recent call last):
File "", line 1, in <module>
File "/Users/abhishek.shivkumar/.local/share/virtualenvs/deepsegment-87kORHDC/lib/python3.7/site-packages/deepcorrect-1.0.5-py3.7.egg/deepcorrect/deepcorrect.py", line 9, in __init__
File "/Users/abhishek.shivkumar/.local/share/virtualenvs/deepsegment-87kORHDC/lib/python3.7/site-packages/Keras-2.3.1-py3.7.egg/keras/engine/saving.py", line 492, in load_wrapper
return load_function(*args, **kwargs)
File "/Users/abhishek.shivkumar/.local/share/virtualenvs/deepsegment-87kORHDC/lib/python3.7/site-packages/Keras-2.3.1-py3.7.egg/keras/engine/network.py", line 1230, in load_weights
f, self.layers, reshape=reshape)
File "/Users/abhishek.shivkumar/.local/share/virtualenvs/deepsegment-87kORHDC/lib/python3.7/site-packages/Keras-2.3.1-py3.7.egg/keras/engine/saving.py", line 1235, in load_weights_from_hdf5_group
' elements.')
ValueError: Layer #1 (named "bidirectional_6" in the current model) was found to correspond to layer embedding_1 in the save file. However the new layer bidirectional_6 expects 6 weights, but the saved weights have 1 elements.
`
Can you please help me solve this?
Hi,
I am getting the following error: 'OSError: Unable to open file (unable to open file: name = 'checkpoint_path', errno = 2, error message = 'No such file or directory', flags = 0, o_flags = 0)'
Do you know how to fix it?
Kind regards!
Could you please tell me the steps needed to run deepcorrect after cloning your repo?
I would like to use the pre-trained model from your demo. Where is that stored, and where do I need to load it? I would like to test it on unpunctuated text.
What limits the sentence length in/out to 200? Is it because of the decoder? How can I change that to, let's say, 500?
Thank you!
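On the length question: until the model is retrained with a larger maximum length, one workaround (my own sketch, not a deepcorrect feature) is to split long input into pieces under the limit and correct each piece separately:

```python
def chunk_text(text, max_len=200):
    """Greedily pack whole words into chunks of at most max_len characters."""
    chunks, current = [], ""
    for word in text.split():
        candidate = (current + " " + word).strip()
        if len(candidate) <= max_len:
            current = candidate
        else:
            if current:
                chunks.append(current)
            current = word
    if current:
        chunks.append(current)
    return chunks

# Each chunk can then be passed to corrector.correct() on its own.
print(chunk_text("one two three four", 9))  # ['one two', 'three', 'four']
```

Splitting mid-sentence will degrade punctuation quality, so splitting on sentence boundaries first, where possible, would be better.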
input_data (list): List of input strings.
output_data (list): List of output strings.
In your model, do you use data like:
a list of tokens, for example ['Hi,', 'how', 'are', 'you?', 'Great,', 'thanks!']
or
a list of sentences, for example ['Hi, how are you?', 'Great, thanks!']
And what are the maximum lengths you used?
max_lenghts=(?, ?)
Thanks in advance for answering a stupid question!
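Judging by the accompanying blog posts, the punctuation model is trained on sentence strings rather than token lists, with the input being the target sentence lowercased and stripped of punctuation; that is my reading, not a confirmed spec. A sketch of building such pairs:

```python
import re

def make_pair(sentence):
    """(input, output) pair: input is the sentence lowercased, punctuation removed."""
    stripped = re.sub(r"[^\w\s]", "", sentence).lower()
    return stripped, sentence

sentences = ["Hi, how are you?", "Great, thanks!"]
input_data, output_data = zip(*[make_pair(s) for s in sentences])
print(list(input_data))   # ['hi how are you', 'great thanks']
```

`input_data` and `output_data` would then be the two lists the training docstring above refers to.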
Hi, I am interested in using your tool.
I have installed it via pip3, and tried your sample code:
from deepcorrect import DeepCorrect
corrector = DeepCorrect('params_path', 'checkpoint_path')
corrector.correct('hey')
'Hey!'
The problem is that
corrector = DeepCorrect('params_path', 'checkpoint_path')
returns the following message with any of the 3 pre-trained checkpoints:
Loading the params file
Input encoding {'o': 2, '{': 3, '.': 4, 'J': 5, ... ';': 96, '-': 97}
Input decoding {2: 'o', 3: '{', 4: '.', 5: 'J', ... 96: ';', 97: '-'}
Output encoding {'o': 2, '{': 3, '.': 4, 'J': 5, ... ';': 96, '-': 97}
Output decoding {2: 'o', 3: '{', 4: '.', 5: 'J', ... 96: ';', 97: '-'}
Traceback (most recent call last):
File "", line 1, in <module>
File "/home/toliz/.local/lib/python3.6/site-packages/deepcorrect/deepcorrect.py", line 8, in __init__
DeepCorrect.deepcorrect_model = build_model(params_path)
File "/home/toliz/.local/lib/python3.6/site-packages/txt2txt/txt2txt.py", line 204, in build_model
decoder = LSTM(2 * enc_lstm_units, return_sequences=True, unroll=unroll)(decoder, initial_state=[encoder_last, encoder_last])
File "/home/toliz/.local/lib/python3.6/site-packages/keras/layers/recurrent.py", line 574, in __call__
return super(RNN, self).call(inputs, **kwargs)
File "/home/toliz/.local/lib/python3.6/site-packages/keras/engine/base_layer.py", line 431, in __call__
self.build(unpack_singleton(input_shapes))
File "/home/toliz/.local/lib/python3.6/site-packages/keras/layers/recurrent.py", line 503, in build
if [spec.shape[-1] for spec in self.state_spec] != state_size:
File "/home/toliz/.local/lib/python3.6/site-packages/keras/layers/recurrent.py", line 503, in <listcomp>
if [spec.shape[-1] for spec in self.state_spec] != state_size:
TypeError: 'NoneType' object is not subscriptable
Could you please help me? I am using Ubuntu 18.04 and Python 3.6.
When I run

from deepcorrect import DeepCorrect
corrector = DeepCorrect('deeppunct_params_en', 'deeppunct_checkpoint_tatoeba_cornell')
corrector.correct('hey')

it gives me "hey.", and when I run

corrector.correct('hello how are you what is your name')

it gives me 'Hello, how are you what is your name.'
There are no question marks. Why?
Please help me: what should I do?
from deepcorrect import DeepCorrect
checkpoint_path = "./deep_punct_v2_model/deeppunct_checkpoint_wikipedia"
params_path = "./deep_punct_v2_model/deeppunct_params_en"
corrector = DeepCorrect(params_path, checkpoint_path)
corrector.correct('how are you')
This is my output:
[{'sequence': 'How are you?', 'prob': 0.9735229413488089}]
I only want to print the 'How are you?' without printing the sequence and prob. What should I do?
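Given the return shape shown above (a list of candidate dicts), plain indexing pulls out just the text; a minimal sketch assuming that shape:

```python
def top_sequence(results):
    """Return only the top candidate's text from a correct()-style result list."""
    return results[0]["sequence"]

results = [{"sequence": "How are you?", "prob": 0.9735229413488089}]
print(top_sequence(results))  # How are you?
```

So `print(corrector.correct('how are you')[0]['sequence'])` should print only the corrected sentence.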
Hi. I want to add Ukrainian to it, but I can't find a tutorial for training the model manually.
Hi, I tried training txt2txt on a German dataset for punctuation correction, then used deepcorrect to punctuate the text, but the results are incorrect. Could you please guide me on this?
Even when I pass the training data itself, the output is consistently truncated after a few words. Please advise.
corrector.correct("in keiner dieser debatten wurden diese grundsätze irgendwie bestritten oder angezweifelt")
[[{'sequence': 'in keiner', 'prob': 0.26473349747315716}]]
corrector.correct("es geht mir in diesem kontext darum spielraum zu schaffen für zwei dinge")
[[{'sequence': 'es geht m', 'prob': 0.1473360733543895}]]
from deepcorrect import DeepCorrect
corrector = DeepCorrect('params_path', 'checkpoint_path')
corrector.correct('how are you')

I'm getting an error like:
`Traceback (most recent call last):
File "C:\Users\skullcandy\AppData\Roaming\Python\Python38\site-packages\IPython\core\interactiveshell.py", line 3331, in run_code
exec(code_obj, self.user_global_ns, self.user_ns)
File "", line 1, in <module>
from deepcorrect import DeepCorrect
File "C:\Users\skullcandy\AppData\Local\Programs\Python\Python38-32\lib\site-packages\deepcorrect\__init__.py", line 1, in <module>
from .deepcorrect import DeepCorrect
File "C:\Users\skullcandy\AppData\Local\Programs\Python\Python38-32\lib\site-packages\deepcorrect\deepcorrect.py", line 1, in <module>
from txt2txt import build_model, infer
File "C:\Users\skullcandy\AppData\Local\Programs\Python\Python38-32\lib\site-packages\txt2txt\__init__.py", line 3, in <module>
from .txt2txt import *
File "C:\Users\skullcandy\AppData\Local\Programs\Python\Python38-32\lib\site-packages\txt2txt\txt2txt.py", line 7, in <module>
import tensorflow as tf
File "C:\Users\skullcandy\AppData\Local\Programs\Python\Python38-32\lib\site-packages\tensorflow\__init__.py", line 24, in <module>
from tensorflow.python import pywrap_tensorflow  # pylint: disable=unused-import
File "C:\Users\skullcandy\AppData\Local\Programs\Python\Python38-32\lib\site-packages\tensorflow\python\__init__.py", line 49, in <module>
from tensorflow.python import pywrap_tensorflow
File "C:\Users\skullcandy\AppData\Local\Programs\Python\Python38-32\lib\site-packages\tensorflow\python\pywrap_tensorflow.py", line 58, in <module>
from tensorflow.python.pywrap_tensorflow_internal import *
File "C:\Users\skullcandy\AppData\Local\Programs\Python\Python38-32\lib\site-packages\tensorflow\python\pywrap_tensorflow_internal.py", line 114
def TFE_ContextOptionsSetAsync(arg1, async):
^
SyntaxError: invalid syntax
`
Can you help me get it working?
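The root cause here is almost certainly the Python version, not deepcorrect: `async` became a reserved keyword in Python 3.7, so any TensorFlow build that still declares `def TFE_ContextOptionsSetAsync(arg1, async)` cannot even be parsed on Python 3.8. The snippet below reproduces the failure mode without TensorFlow; the practical fix (my suggestion, not tested against this exact setup) is either a Python 3.6 environment with TF 1.x or a newer TensorFlow release on 3.8.

```python
import keyword

# On Python 3.7+, `async` is a hard keyword, so it can't be a parameter name.
print(keyword.iskeyword("async"))  # True on Python 3.7+

# Compiling the same signature the old TF stub uses reproduces the import error:
try:
    compile("def TFE_ContextOptionsSetAsync(arg1, async): pass", "<stub>", "exec")
    print("parsed (pre-3.7 behaviour)")
except SyntaxError:
    print("SyntaxError: invalid syntax")
```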