gr-azure's Introduction

Azure software radio Out of Tree Module

The gr-azure Out of Tree (OOT) Module allows users to easily leverage Azure cloud resources from within a GNU Radio flowgraph. You can use this OOT module with your existing GNU Radio development environment, or within a VM in the cloud. Example use-cases involve storing and retrieving RF recordings from Blob (file) storage or creating complex cloud applications using Azure Event Hubs as a bridge between your flowgraph and Azure services. We are excited to see what can be created by combining GNU Radio with the power and scalability of the cloud!

For information on our GNU Radio developer VM available in Azure, see this guide. We also have a set of tutorials that use the developer VM and gr-azure.

Getting Started

The following installation instructions will get you up and running with the Azure OOT Module on your local machine.

Prerequisites

This project depends on the GNU Radio 3.9.x or 3.10.x runtime and development dependencies. See the GNU Radio installation instructions for steps on installing GNU Radio from binaries (note that GNU Radio packaged with Ubuntu 20 is only 3.8). Some package managers do not automatically install all of the development dependencies, so you may need to separately install and configure some of them. The Azure software radio OOT module requires the following:

  • GNU Radio 3.9.x or 3.10.x
  • Python 3.8 or greater
  • cmake
  • liborc-dev
  • doxygen
  • pytest
  • pybind11
  • Additional Python packages are listed in python/requirements.txt

See the installation steps below for how to install these dependencies.

NOTE: If using the Azure CLI, you will need version 2.17.1 or newer. This module is not compatible with the Azure CLI available in the default apt repository on Ubuntu 20. If this older version of the Azure CLI is present on your system, the installation of this OOT module may fail or the module may crash at runtime. Please install the Azure CLI according to the recommendations found in AZ CLI Installation in Linux.

Installing Azure software radio OOT

The following steps show how to install this OOT module on a Debian-based OS with GNU Radio already installed. They have been tested to work under Ubuntu 20. If you see error messages after running any of the following steps, stop and check our FAQ for how to resolve the problem.

sudo apt-get install python3-pip cmake liborc-dev doxygen
sudo pip install pytest pybind11

git clone https://github.com/microsoft/gr-azure.git
cd gr-azure

sudo pip install -r python/requirements.txt

mkdir build
cd build
cmake ..
make -j4
sudo make install
sudo ldconfig

(If you run into a non-existent path error after running cmake .., try recreating your build directory and running cmake -DCMAKE_FIND_ROOT_PATH=/usr .. instead.)

At this point the OOT module should be installed, and you should see the additional blocks within GNU Radio Companion.
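
As a quick sanity check, you can also try importing the module's Python bindings from a Python shell. This is a sketch: the import name azure_software_radio is an assumption based on this module's naming, so adjust it if your install differs.

import azure_software_radio  # assumed import name; adjust if your install differs
print(azure_software_radio.__file__)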

Running the Unit Tests

If you would like to run the QA tests, there are two methods:

  1. From within the build directory, run:

    make test
    

    You can review detailed test output (including any failures) in Testing/Temporary/LastTest.log.

  2. From within the python directory, run:

    python -m pytest qa_*
    

    Pytest will show detailed test results directly in the output of this command.

Resolutions to Common Problems During Installation and Tests

For a list of common problems and resolutions, please check our FAQ to see if your issue has been addressed.

Examples

The examples folder has a collection of flowgraphs and supporting files that illustrate common ways of using the blocks provided in this module. See the README in the examples folder to get started.

Azure software radio Out of Tree Module Blocks

Key Vault Block

The Key Vault block allows users to pull down keys and secrets from an Azure Key Vault in GNU Radio. It is expected that the user will set up and store secrets in an Azure Key Vault prior to pulling them down with this block. To create a Key Vault, see Create Key Vault.
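
For context, retrieving a secret with the Azure SDK for Python looks like the minimal sketch below. The vault URL and secret name are placeholders, and this illustrates the kind of lookup the block performs rather than its exact implementation.

# Minimal sketch using the azure-identity and azure-keyvault-secrets packages.
# "my-vault" and "my-secret" are placeholder names.
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

credential = DefaultAzureCredential()
client = SecretClient(vault_url="https://my-vault.vault.azure.net", credential=credential)
print(client.get_secret("my-secret").value)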

For a brief tutorial on using this block, see the Key Vault Example.

Blob Blocks

The two Blob blocks (source and sink) provide an interface to read and write samples to Azure Blob storage in GNU Radio. Note that the SigMF Blob Source/Sink are simply wrappers around the regular Blob Source/Sink with SigMF mode set to True. It is expected that the user will set up a storage account and a container prior to accessing Blob storage with the Blob source and sink blocks. To create a storage account, see Create Storage Account.

  • Blob Source Block & SigMF Blob Source Block
    The Blob source block reads samples from Azure Blob storage. This block currently supports block blobs and the following outputs: complex, float, int, short and byte (Page blobs and append blobs are not supported at this time).

  • Blob Sink Block & SigMF Blob Sink Block
    The Blob sink block writes samples to Azure Blob storage. This block currently supports block blobs and the following inputs: complex, float, int, short and byte (Page blobs and append blobs are not supported at this time).

There are several ways to authenticate to the Azure Blob backend: these blocks support authentication using a connection string, a URL with an embedded SAS token, or credentials supported by the DefaultAzureCredential class.
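
For reference, here is a minimal sketch of those three options using the azure-storage-blob Python package directly; account, container, and token values are placeholders.

from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

# Option 1: connection string
client = BlobServiceClient.from_connection_string("<connection-string>")

# Option 2: account URL with an embedded SAS token
client = BlobServiceClient("https://<account>.blob.core.windows.net?<sas-token>")

# Option 3: DefaultAzureCredential (az login, managed identity, etc.)
client = BlobServiceClient(
    "https://<account>.blob.core.windows.net",
    credential=DefaultAzureCredential(),
)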

To determine the maximum speed at which samples can be downloaded from or uploaded to Blob storage in different Azure regions, use the speed tests at https://www.azurespeed.com/Azure/Download and https://www.azurespeed.com/Azure/Upload from the VM or machine running GNU Radio.

For a brief tutorial on using these blocks, see the Blob Examples.

Event Hub Blocks

The Event Hub blocks (source and sink) provide an interface to send and receive events to and from Azure Event Hubs using the message passing interface in GNU Radio. It is expected that the user will create an Event Hubs namespace, an Event Hub entity, and a consumer group prior to using the Event Hub source and sink blocks. To create an Event Hub, see Create an Event Hub.

  • EventHub Source Block
    The EventHub source block receives a JSON-formatted event message from an Azure Event Hub and converts it to GNU Radio's PMT format.

  • EventHub Sink Block
    The EventHub sink block converts a PMT message to JSON and sends it to an Azure Event Hub.

These blocks support multiple ways to authenticate to the Azure Event Hubs backend, such as a connection string, a SAS token, or credentials supported by the DefaultAzureCredential class.
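
As a point of reference, publishing a JSON event that the EventHub source block could then pick up might look like the sketch below, using the azure-eventhub Python package; the connection string, entity name, and message fields are placeholders.

import json
from azure.eventhub import EventHubProducerClient, EventData

producer = EventHubProducerClient.from_connection_string(
    "<connection-string>", eventhub_name="<event-hub-name>"
)
batch = producer.create_batch()
batch.add(EventData(json.dumps({"freq": 915e6, "gain": 20})))
producer.send_batch(batch)
producer.close()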

For a brief tutorial on using these blocks, see the Event Hub Examples.

REST API Block

The REST API block allows users to get status from and configure a running top block in GNU Radio. It starts a server on the configured port and restricts which settings and variables in a flowgraph are readable, writable, or callable.

To get status from a flowgraph, a user can hit the status endpoint as follows:

curl -X GET http://<IP>:<port>/status

To configure or write the exposed variables, use the following command:

curl -X PUT http://<IP>:<port>/config -H 'Content-Type: application/json' -d '{"<variable>":<value>}'

To execute a callback or function within a top block, use the following route:

curl -X PUT http://<IP>:<port>/call -H 'Content-Type: application/json' -d '{"<function name>":<parameter>}'
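
The same three calls can also be scripted. Here is a short sketch using Python's requests package; the port and the samp_rate / set_gain names are hypothetical examples from your own flowgraph, and it assumes the status endpoint returns JSON.

import requests

base = "http://localhost:8080"  # hypothetical host and port
print(requests.get(f"{base}/status").json())  # read status (assumes a JSON response)
requests.put(f"{base}/config", json={"samp_rate": 2e6})  # write an exposed variable
requests.put(f"{base}/call", json={"set_gain": 30})  # invoke an exposed callback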

For a brief tutorial on using this block, see the REST API Example.

Frequently Asked Questions

For a list of common questions, including problems and resolutions, please check our FAQ.

Support

This project uses GitHub Issues to track bugs and feature requests. Please refer to our Support Guide for more details.

Before filing a new issue, please check our FAQ for potential solutions to common problems.

Starting with GNU Radio maint-3.9, this project will support the same set of maintenance branches tracked by GNU Radio.

Contributing

Contributing Guide

License

This project is licensed under the GNU General Public License v3.0 or later; see the LICENSE file for details.

Trademarks

This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow Microsoft's Trademark & Brand Guidelines. Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos is subject to those third parties' policies.

gr-azure's People

Contributors

777arc, dependabot[bot], fiu-elf, jo-rivera, microsoft-github-operations[bot], microsoft-github-policy-service[bot]


gr-azure's Issues

File Caching with Blobs

Some users may want to cache blob file contents if they intend to re-run the same processing multiple times without re-downloading the same file over and over.

It may be useful to include some pointers in the documentation on how to handle these use cases, possibly by using something like a Lustre file system and file source blocks if users want to flexibly work with a large number of blobs, or using AzCopy to manually copy smaller numbers of blobs and again using file source blocks.

https://azure.microsoft.com/en-us/updates/lustre-hsm-tools-now-available-to-import-from-or-export-to-azure-storage/

https://docs.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-v10
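
For the simplest variant of this, here is a minimal Python sketch of downloading a blob once to local disk and then pointing a regular File Source block at it; the connection string, container, and blob names are placeholders.

import os
from azure.storage.blob import BlobClient

local_path = "/tmp/capture.sigmf-data"
if not os.path.exists(local_path):  # skip the download on re-runs
    blob = BlobClient.from_connection_string(
        "<connection-string>",
        container_name="recordings",
        blob_name="capture.sigmf-data",
    )
    with open(local_path, "wb") as f:
        f.write(blob.download_blob().readall())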

AB#10542

Feature Request: Blob Source/Sink Rate Speed Test

As I've been using the blob blocks, mainly the source block, I have had to manually figure out what max rates I can achieve in order to stream samples at 100% rate without delays mixed in. When I'm on a VM in Azure, I think the rate is limited by the bandwidth between the VM and blob storage, which drops significantly if the blob storage account is in a different region than the VM; for some applications that could break the flowgraph just from moving from West to East, for example. Standard vs. premium storage accounts may also impact it.

So one idea is a block with no inputs or outputs that you can plop into a flowgraph and hit run, and it runs a simple test to see how fast it's able to read and write from blob storage, like UHD's benchmark_rate tool. It could also just be a command-line utility that comes with the OOT instead of a block, but if it's a block then people who don't read any docs will still see it =). I don't know if there's a utility built into Azure already that we could just call and spit the results out to the console, but even if it has to be done manually on the source side it shouldn't be too hard: point at an example IQ file in blob storage, read it into a flowgraph with no throttle into a null sink or something, then probe the rate.

We could even set up storage accounts in all the major regions and add a drop-down block param to choose the region for the speed test. That way users wouldn't need a storage account already set up to run the test, and there would be no block params to fiddle with to get it running. Just an idea that seems like it might be useful to folks who want to use blob storage for real applications.
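
A rough sketch of what such a measurement could look like in Python, outside of a flowgraph: it simply times a fixed-size read from an existing blob. All names and sizes are placeholders.

import time
from azure.storage.blob import BlobClient

blob = BlobClient.from_connection_string(
    "<connection-string>",
    container_name="recordings",
    blob_name="test-signal.dat",
)
n_bytes = 100 * 1024 * 1024  # read 100 MiB of the blob
start = time.time()
data = blob.download_blob(offset=0, length=n_bytes).readall()
elapsed = time.time() - start
print(f"{len(data) / 1e6 / elapsed:.1f} MB/s over {elapsed:.1f} s")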

AB#10544

Cannot build Azure Software Radio OOT modules in Azure Software Radio Development VM

I cloned the repository in the Azure Development VM, and I tried to build the OOT modules but I'm getting this error after running cmake ..:

-- The CXX compiler identification is GNU 9.3.0
-- The C compiler identification is GNU 9.3.0
-- Check for working CXX compiler: /bin/c++
-- Check for working CXX compiler: /bin/c++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Check for working C compiler: /bin/cc
-- Check for working C compiler: /bin/cc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Build type not specified: defaulting to release.
-- Found PkgConfig: /bin/pkg-config (found version "0.29.1") 
-- Found LOG4CPP: /usr/lib/x86_64-linux-gnu/liblog4cpp.so
-- Checking for module 'gmp'
--   Found gmp, version 6.2.0
-- Found GMP: /usr/lib/x86_64-linux-gnu/libgmpxx.so  
-- Using GMP.
-- Found MPLIB: /usr/lib/x86_64-linux-gnu/libgmpxx.so  
-- Found Boost: /lib/x86_64-linux-gnu/cmake/Boost-1.71.0/BoostConfig.cmake (found suitable version "1.71.0", minimum required is "1.71.0") found components: date_time program_options filesystem system regex thread unit_test_framework 
-- Found Volk: Volk::volk  
-- User set python executable /usr/bin/python3
-- Found PythonInterp: /usr/bin/python3 (found version "3.8.10") 
-- Found PythonLibs: /usr/lib/x86_64-linux-gnu/libpython3.8.so (found suitable exact version "3.8.10") 
-- Found Git: /bin/git  
-- Could NOT find Doxygen (missing: DOXYGEN_EXECUTABLE) 
-- Found PythonLibs: /usr/lib/x86_64-linux-gnu/libpython3.8.so
-- Using install prefix: /usr/local
-- Building for version: v1.0-compat-xxx-xunknown / 1.0.0git
-- No C++ unit tests... skipping
-- Could NOT find Doxygen (missing: DOXYGEN_EXECUTABLE) 
-- PYTHON and GRC components are enabled
-- Python checking for pygccxml - not found
-- Performing Test HAS_FLTO
-- Performing Test HAS_FLTO - Success
-- LTO enabled
-- Configuring done
CMake Error in lib/CMakeLists.txt:
  Imported target "gnuradio::gnuradio-runtime" includes non-existent path

    "/include"

  in its INTERFACE_INCLUDE_DIRECTORIES.  Possible reasons include:

  * The path was deleted, renamed, or moved to another location.

  * An install or uninstall procedure did not complete successfully.

  * The installation package was faulty and references files it does not
  provide.



CMake Error in lib/CMakeLists.txt:
  Imported target "gnuradio::gnuradio-runtime" includes non-existent path

    "/include"

  in its INTERFACE_INCLUDE_DIRECTORIES.  Possible reasons include:

  * The path was deleted, renamed, or moved to another location.

  * An install or uninstall procedure did not complete successfully.

  * The installation package was faulty and references files it does not
  provide.



CMake Error in python/bindings/CMakeLists.txt:
  Imported target "Boost::date_time" includes non-existent path

    "/include"

  in its INTERFACE_INCLUDE_DIRECTORIES.  Possible reasons include:

  * The path was deleted, renamed, or moved to another location.

  * An install or uninstall procedure did not complete successfully.

  * The installation package was faulty and references files it does not
  provide.



CMake Error in python/bindings/CMakeLists.txt:
  Imported target "Boost::date_time" includes non-existent path

    "/include"

  in its INTERFACE_INCLUDE_DIRECTORIES.  Possible reasons include:

  * The path was deleted, renamed, or moved to another location.

  * An install or uninstall procedure did not complete successfully.

  * The installation package was faulty and references files it does not
  provide.



-- Generating done
CMake Generate step failed.  Build files cannot be regenerated correctly.

I would like to be able to build the OOT modules with the default VM image. Am I doing something wrong?

CMake configuration is different when running from SSH vs in RDP session

Depending on the environment used to run cmake when building OOT modules in the Developer VM, Python files may be installed into different directories.

The build process should produce consistent results regardless of whether a user runs CMake from a direct SSH terminal in the Developer VM or from a terminal inside a remote desktop session.

AB#10541

test-signal.dat file not creating on sink blob

The test-signal.dat file is not being created in the sink blob container. I configured everything as suggested, and I am using Azure CLI authentication.

I was following:
Quickstart: Key Vault with Role Based Access Controls and Azure CLI Credentials.
Quickstart: Running the Blob Source and Sink blocks with az login.

DIFI Blocks documentation not showing up in docs tab

The Python blocks' docs look fine, but the C++ blocks (currently just the 2 DIFI blocks) don't have their docstrings showing up in GRC's documentation tab. The docs seem to be in the right spot in the header file, using the proper Doxygen tags, so this might be a CMake issue? Both the dev VM and my local install demonstrate this issue.


AB#10543

Build fails on fresh Ubuntu 20.04, missing liborc-0.4.so

I am following the README in gr-azure-software-radio on a fresh Ubuntu 20.04 VM.

Here is the output of cmake ..:

-- The CXX compiler identification is GNU 9.3.0
-- The C compiler identification is GNU 9.3.0
-- Check for working CXX compiler: /usr/bin/c++
-- Check for working CXX compiler: /usr/bin/c++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Check for working C compiler: /usr/bin/cc
-- Check for working C compiler: /usr/bin/cc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Build type not specified: defaulting to release.
-- Found PkgConfig: /usr/bin/pkg-config (found version "0.29.1") 
-- Found LOG4CPP: /usr/lib/x86_64-linux-gnu/liblog4cpp.so
-- Checking for module 'gmp'
--   Found gmp, version 6.2.0
-- Found GMP: /usr/lib/x86_64-linux-gnu/libgmpxx.so  
-- Using GMP.
-- Found MPLIB: /usr/lib/x86_64-linux-gnu/libgmpxx.so  
-- Found Boost: /usr/lib/x86_64-linux-gnu/cmake/Boost-1.71.0/BoostConfig.cmake (found suitable version "1.71.0", minimum required is "1.71.0") found components: date_time program_options filesystem system regex thread unit_test_framework 
-- Found VOLK: Volk::volk  
-- User set python executable /usr/bin/python3
-- Found PythonInterp: /usr/bin/python3 (found version "3.8.10") 
-- Found PythonLibs: /usr/lib/x86_64-linux-gnu/libpython3.8.so (found suitable exact version "3.8.10") 
-- Found Git: /usr/bin/git  
-- Could NOT find Doxygen (missing: DOXYGEN_EXECUTABLE) 
-- Found PythonLibs: /usr/lib/x86_64-linux-gnu/libpython3.8.so
-- Using install prefix: /usr/local
-- Building for version: v1.0-compat-xxx-xunknown / 1.0.0git
-- No C++ unit tests... skipping
-- Could NOT find Doxygen (missing: DOXYGEN_EXECUTABLE) 
-- PYTHON and GRC components are enabled
-- Python checking for pygccxml - not found
-- Performing Test HAS_FLTO
-- Performing Test HAS_FLTO - Success
-- LTO enabled
-- Configuring done
-- Generating done
-- Build files have been written to: /home/gustavo/code/azure-software-radio/gr-azure-software-radio/build

And here is the output of make:

Scanning dependencies of target gnuradio-azure_software_radio
[  8%] Building CXX object lib/CMakeFiles/gnuradio-azure_software_radio.dir/difi_source_cpp_impl.cc.o
[ 16%] Building CXX object lib/CMakeFiles/gnuradio-azure_software_radio.dir/difi_sink_cpp_impl.cc.o
make[2]: *** No rule to make target '/usr/lib/x86_64-linux-gnu/liborc-0.4.so', needed by 'lib/libgnuradio-azure_software_radio.so.v1.0-compat-xxx-xunknown'.  Stop.
make[1]: *** [CMakeFiles/Makefile2:251: lib/CMakeFiles/gnuradio-azure_software_radio.dir/all] Error 2
make: *** [Makefile:141: all] Error 2

Then I googled around a bit and found that this means I'm missing liborc-0.4.so. I was able to fix it with sudo apt install liborc-0.4-dev, after which I could install the module and use it in gnuradio-companion. But is this the right thing to do? If so, I think the README should be updated to call out this dependency on liborc.

Can't find the ADS-B Framer GRC Blocks

Possible Bug: I-Short handling in blob sink block

In the Blob Sink yaml here https://github.com/microsoft/azure-software-radio/blob/main/gr-azure-software-radio/grc/azure_software_radio_blob_sink.block.yml#L29, I'm not sure we're handling i-shorts correctly. It shows them using the int32 datatype, but i-shorts usually use int16, and you just have to keep track separately that they are interleaved IQ instead of plain reals. That's why when you read in a file of i-shorts, you read it in as real shorts and then use the i-short-to-complex block. No one really uses int32s themselves (and I don't know of any blocks that do), so we just need to make sure we support complex float32, float32, int16, complex int16, uint8, and maybe complex uint8; that will cover 99.99% of use cases. When you give the Blob Sink the data, there's no difference between complex and real for the int16s and uint8s (chars): you would save them to a binary file the same way. But now that we have SigMF support, there is a reason to keep track, perhaps with an "is complex" block param, so that the datatype in the SigMF meta file is accurate.

I think we have to change it to use int16 for the "sc16" type, but then we also need a way to distinguish between sc16 and plain int16 other than the dtype, which will now be the same for both. Perhaps a boolean is_complex or something similar that the user can set when they have i-shorts being fed into the block, so the SigMF side knows how to label it.
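
For illustration, the i-short convention described above looks like this in numpy (hypothetical filename):

import numpy as np

# Interleaved int16 I/Q pairs on disk: I0, Q0, I1, Q1, ...
raw = np.fromfile("capture.dat", dtype=np.int16)
# Promote to float32, then reinterpret adjacent pairs as complex64 samples
iq = raw.astype(np.float32).view(np.complex64)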

AB#11258

DefaultAzureCredential Authentication Order

Background:
During integration testing, errors came up from "integration_blob_common.py" concerning blob permissions:

azure.core.exceptions.HttpResponseError: This request is not authorized to perform this operation using this permission.

The current documentation recommends using the "az login" command (az cli) to prevent this issue, but after multiple tries it was discovered that the test was using the managed identity, which didn't have any permissions on the storage account. From reviewing the documentation, this behavior is consistent with the credential order defined here.

Workaround:
As part of our testing, we found that granting the VM's managed identity access to the storage account and the Key Vault allowed the integration tests to pass.

Next steps:
The development group needs to issue a recommendation on credentials. Most of the content mentions the az cli, but given that we have also asked users to assign a managed identity during VM creation (picture below), the authentication order will not allow the az cli credentials to take effect.

Alternatively, we can add comments reminding the user to grant access to the managed identity, and delete the references to the az cli.
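
At the code level, one way around the ordering problem is to construct a specific credential rather than relying on DefaultAzureCredential's chain. A sketch using the azure-identity package:

from azure.identity import AzureCliCredential, DefaultAzureCredential

# Use the "az login" session explicitly, bypassing the managed identity
credential = AzureCliCredential()

# Or keep the default chain but skip the managed identity step
credential = DefaultAzureCredential(exclude_managed_identity_credential=True)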

[screenshot: assigning a managed identity during VM creation]

AB#10539
