
UaFile: Comfort method to write file #1126

Merged: 2 commits, Nov 20, 2022
Conversation

@martigr (Contributor) commented Nov 18, 2022

Currently it is possible to write a file with the low-level file services, but the corresponding comfort method in asyncua/client/ua_file.py is missing.

I have now added a write() method to asyncua/client/ua_file.py. This allows writing a file conveniently without using the low-level services directly.

from asyncua import Client
from asyncua.client.ua_file import UaFile

async with Client(url=url) as client:
    file_node = client.get_node("ns=2;s=NameOfNode")
    async with UaFile(file_node, 'w') as ua_file:
        with open("filename.txt", 'rb') as fd:
            data = fd.read()
            await ua_file.write(data)  # write the local file's contents to the server file

@AndreasHeine (Member)

What if the file is larger than the available memory?

Using an iterator over a file object:

for line in file:
    print(line)

@schroeder- (Contributor)

What if the file is larger than the available memory?

This is not a problem, because most OPC UA servers don't allow more than 16 MB of data in a single service call.

@martigr (Contributor, Author) commented Nov 18, 2022

What if the file is larger than the available memory?

Using an iterator over a file object:

for line in file:
    print(line)

The code snippet in my comment is just a usage example.
I agree that memory issues could occur; in a real-world application, writing in chunks would be better.
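For illustration only, here is a minimal sketch of such a chunked upload. It assumes the new UaFile.write() can be called repeatedly within one open/close cycle and appends at the current position (as the underlying OPC UA Write service does); the chunk size and the upload_in_chunks helper are hypothetical names, not part of this PR:

import asyncio

from asyncua import Client
from asyncua.client.ua_file import UaFile

CHUNK_SIZE = 4 * 1024 * 1024  # hypothetical chunk size, well below the ~16 MB service limit mentioned above


async def upload_in_chunks(url: str, node_id: str, local_path: str) -> None:
    # Upload a local file piece by piece so the whole file never sits in RAM at once.
    async with Client(url=url) as client:
        file_node = client.get_node(node_id)
        async with UaFile(file_node, 'w') as ua_file:
            with open(local_path, 'rb') as fd:
                while True:
                    chunk = fd.read(CHUNK_SIZE)
                    if not chunk:
                        break
                    # Assumption: repeated write() calls append at the current file position.
                    await ua_file.write(chunk)


# asyncio.run(upload_in_chunks("opc.tcp://localhost:4840", "ns=2;s=NameOfNode", "filename.txt"))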

@AndreasHeine (Member)

What if the file is larger than the available memory?

This is not a problem, because most OPC UA servers don't allow more than 16 MB of data in a single service call.

The thing I am thinking about is whether I really need the whole file in RAM (it is probably not the only file ^^).
Maybe I am overly concerned...

@oroulet merged commit 3e63cea into FreeOpcUa:master on Nov 20, 2022