Column: package (string lengths 1-122)
Column: package-description (string lengths 0-1.3M)
adafruit-circuitpython-stmpe610
Introduction
Adafruit CircuitPython module for the STMPE610 Resistive Touch Screen Controller.

Dependencies
This driver depends on:
- Adafruit CircuitPython
- Bus Device
- Register
Please ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloading the Adafruit library and driver bundle.

Installing from PyPI
On supported GNU/Linux systems like the Raspberry Pi, you can install the driver locally from PyPI. To install for the current user:

    pip3 install adafruit-circuitpython-stmpe610

To install system-wide (this may be required in some cases):

    sudo pip3 install adafruit-circuitpython-stmpe610

To install in a virtual environment in your current project:

    mkdir project-name && cd project-name
    python3 -m venv .venv
    source .venv/bin/activate
    pip3 install adafruit-circuitpython-stmpe610

Usage Example
See the examples in the GitHub repository: https://github.com/adafruit/Adafruit_CircuitPython_STMPE610/tree/master/examples

Documentation
API documentation for this library can be found on Read the Docs. For information on building library documentation, please check out this guide.

Contributing
Contributions are welcome! Please read our Code of Conduct before contributing to help this project stay welcoming.
adafruit-circuitpython-tc74
Introduction
CircuitPython library for the Microchip TC74 Digital Temperature Sensor.

Dependencies
This driver depends on:
- Adafruit CircuitPython
- Bus Device
- Register
Please ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloading the Adafruit library and driver bundle.

Installing from PyPI
On supported GNU/Linux systems like the Raspberry Pi, you can install the driver locally from PyPI. To install for the current user:

    pip3 install adafruit-circuitpython-tc74

To install system-wide (this may be required in some cases):

    sudo pip3 install adafruit-circuitpython-tc74

To install in a virtual environment in your current project:

    mkdir project-name && cd project-name
    python3 -m venv .venv
    source .venv/bin/activate
    pip3 install adafruit-circuitpython-tc74

Usage Example

    import time
    import board
    import adafruit_tc74

    i2c = board.I2C()
    tc = adafruit_tc74.TC74(i2c)

    while True:
        print(f"Temperature: {tc.temperature} C")
        time.sleep(0.5)

Documentation
API documentation for this library can be found on Read the Docs. For information on building library documentation, please check out this guide.

Contributing
Contributions are welcome! Please read our Code of Conduct before contributing to help this project stay welcoming.
adafruit-circuitpython-tca8418
Introduction
CircuitPython / Python library for the TCA8418 Keyboard Multiplexor.

Dependencies
This driver depends on:
- Adafruit CircuitPython
- Bus Device
- Register
Please ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloading the Adafruit library and driver bundle, or individual libraries can be installed using circup.

Works with the TCA8418 keyboard multiplexor. Purchase one from the Adafruit shop.

Installing from PyPI
On supported GNU/Linux systems like the Raspberry Pi, you can install the driver locally from PyPI. To install for the current user:

    pip3 install adafruit-circuitpython-tca8418

To install system-wide (this may be required in some cases):

    sudo pip3 install adafruit-circuitpython-tca8418

To install in a virtual environment in your current project:

    mkdir project-name && cd project-name
    python3 -m venv .env
    source .env/bin/activate
    pip3 install adafruit-circuitpython-tca8418

Installing to a Connected CircuitPython Device with Circup
Make sure that you have circup installed in your Python environment. Install it with the following command if necessary:

    pip3 install circup

With circup installed and your CircuitPython device connected, use the following command to install:

    circup install tca8418

Or the following command to update an existing version:

    circup update

Usage Example
See the examples/ folder for usage examples.

Documentation
API documentation for this library can be found on Read the Docs. For information on building library documentation, please check out this guide.

Contributing
Contributions are welcome! Please read our Code of Conduct before contributing to help this project stay welcoming.
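The TCA8418 reports key activity as single event bytes read from an on-chip FIFO. As a rough illustration of how such an event byte is typically decoded (based on the TCA8418 datasheet layout: bit 7 is the press/release flag, bits 0-6 hold a key number counted row-major from 1; the helper below is hypothetical and not part of the library's API):

```python
def decode_key_event(event: int, columns: int = 10):
    """Decode a raw TCA8418-style key-event byte into (row, col, pressed).

    Sketch only, assuming the datasheet layout: bit 7 = 1 for a press,
    bits 0-6 = key number 1..80 counted row-major over a full 8x10 matrix.
    """
    pressed = bool(event & 0x80)     # bit 7: press (1) or release (0)
    key = event & 0x7F               # bits 0-6: key number, starting at 1
    row, col = divmod(key - 1, columns)
    return row, col, pressed

# Key 12 pressed (0x80 | 12) lands at row 1, column 1 of a 10-column matrix
print(decode_key_event(0x8C))  # (1, 1, True)
```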
adafruit-circuitpython-tca9548a
Introduction
CircuitPython driver for the TCA9548A I2C Multiplexer.

Dependencies
This driver depends on:
- Adafruit CircuitPython
- Bus Device
Please ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloading the Adafruit library and driver bundle.

Installing from PyPI
On supported GNU/Linux systems like the Raspberry Pi, you can install the driver locally from PyPI. To install for the current user:

    pip3 install adafruit-circuitpython-tca9548a

To install system-wide (this may be required in some cases):

    sudo pip3 install adafruit-circuitpython-tca9548a

To install in a virtual environment in your current project:

    mkdir project-name && cd project-name
    python3 -m venv .venv
    source .venv/bin/activate
    pip3 install adafruit-circuitpython-tca9548a

Usage Example

    # This example shows using TCA9548A to perform a simple scan for connected devices
    import board
    import adafruit_tca9548a

    # Create I2C bus as normal
    i2c = board.I2C()  # uses board.SCL and board.SDA

    # Create the TCA9548A object and give it the I2C bus
    tca = adafruit_tca9548a.TCA9548A(i2c)

    for channel in range(8):
        if tca[channel].try_lock():
            print("Channel {}:".format(channel), end="")
            addresses = tca[channel].scan()
            print([hex(address) for address in addresses if address != 0x70])
            tca[channel].unlock()

Documentation
API documentation for this library can be found on Read the Docs. For information on building library documentation, please check out this guide.

Contributing
Contributions are welcome! Please read our Code of Conduct before contributing to help this project stay welcoming.
adafruit-circuitpython-tcs34725
Introduction
CircuitPython module for the TCS34725 color sensor.

Dependencies
This driver depends on:
- Adafruit CircuitPython
- Bus Device
Please ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloading the Adafruit library and driver bundle.

Installing from PyPI
On supported GNU/Linux systems like the Raspberry Pi, you can install the driver locally from PyPI. To install for the current user:

    pip3 install adafruit-circuitpython-tcs34725

To install system-wide (this may be required in some cases):

    sudo pip3 install adafruit-circuitpython-tcs34725

To install in a virtual environment in your current project:

    mkdir project-name && cd project-name
    python3 -m venv .venv
    source .venv/bin/activate
    pip3 install adafruit-circuitpython-tcs34725

Usage Example
See examples/tcs34725_simpletest.py for an example of the module's usage.

Documentation
API documentation for this library can be found on Read the Docs. For information on building library documentation, please check out this guide.

Contributing
Contributions are welcome! Please read our Code of Conduct before contributing to help this project stay welcoming.
adafruit-circuitpython-templateengine
Introduction
Templating engine to substitute variables into a template string. Templates can also include conditional logic and loops, and are often used for web pages.

The library is heavily inspired by Jinja2 and Django Templates, but it does not implement all of their features and takes a different approach to some of them. Main differences from Jinja2 and Django Templates:
- filters are not supported, and there is no plan to support them
- all variables passed inside context must be accessed using the context object
- you can call methods inside templates just like in Python
- no support for nested blocks, although inheritance is supported
- no support for custom tags

Dependencies
This driver depends on:
- Adafruit CircuitPython
Please ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloading the Adafruit library and driver bundle, or individual libraries can be installed using circup.

Installing from PyPI
On supported GNU/Linux systems like the Raspberry Pi, you can install the driver locally from PyPI. To install for the current user:

    pip3 install adafruit-circuitpython-templateengine

To install system-wide (this may be required in some cases):

    sudo pip3 install adafruit-circuitpython-templateengine

To install in a virtual environment in your current project:

    mkdir project-name && cd project-name
    python3 -m venv .venv
    source .venv/bin/activate
    pip3 install adafruit-circuitpython-templateengine

Installing to a Connected CircuitPython Device with Circup
Make sure that you have circup installed in your Python environment. Install it with the following command if necessary:

    pip3 install circup

With circup installed and your CircuitPython device connected, use the following command to install:

    circup install adafruit_templateengine

Or the following command to update an existing version:

    circup update

Usage Example
See the simpletest for an example of how to use it.

Documentation
API documentation for this library can be found on Read the Docs. For information on building library documentation, please check out this guide.

Contributing
Contributions are welcome! Please read our Code of Conduct before contributing to help this project stay welcoming.
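The core idea of variable substitution through a context object can be illustrated with a tiny pure-Python sketch. This is only a conceptual illustration of `{{ context.name }}`-style placeholders, not the adafruit_templateengine API (the placeholder syntax here is an assumption modeled on the "access via the context object" rule described above):

```python
import re

def render(template: str, context: dict) -> str:
    """Replace {{ context.key }} placeholders with values from a dict.

    Conceptual sketch only; not the adafruit_templateengine API.
    """
    def repl(match):
        return str(context[match.group(1)])
    return re.sub(r"\{\{\s*context\.(\w+)\s*\}\}", repl, template)

print(render("Hello, {{ context.name }}!", {"name": "CircuitPython"}))
# Hello, CircuitPython!
```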
adafruit-circuitpython-testrepo
Introduction
This repo exists solely for test purposes!

Dependencies
This driver depends on:
- Adafruit CircuitPython
Please ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloading the Adafruit library and driver bundle.

Usage Example
Repo is for test purposes only.

Documentation
API documentation for this library can be found on Read the Docs. For information on building library documentation, please check out this guide.

Contributing
Contributions are welcome! Please read our Code of Conduct before contributing to help this project stay welcoming.
adafruit-circuitpython-tfmini
Introduction
A CircuitPython/Python library for Benewake's TF mini distance sensor.

Dependencies
This driver depends on:
- Adafruit CircuitPython
Please ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloading the Adafruit library and driver bundle.

Installing from PyPI
On supported GNU/Linux systems like the Raspberry Pi, you can install the driver locally from PyPI. To install for the current user:

    pip3 install adafruit-circuitpython-tfmini

To install system-wide (this may be required in some cases):

    sudo pip3 install adafruit-circuitpython-tfmini

To install in a virtual environment in your current project:

    mkdir project-name && cd project-name
    python3 -m venv .venv
    source .venv/bin/activate
    pip3 install adafruit-circuitpython-tfmini

Usage Example

    import time
    import board  # comment this out if using pyserial
    import busio  # comment this out if using pyserial
    import adafruit_tfmini

    # Use hardware uart
    uart = busio.UART(board.TX, board.RX)

    # Or, you can use pyserial on any computer
    # import serial
    # uart = serial.Serial("/dev/ttyS2", timeout=1)

    # Simplest use, connect with the uart bus object
    tfmini = adafruit_tfmini.TFmini(uart)

    # You can put in 'short' or 'long' distance mode
    tfmini.mode = adafruit_tfmini.MODE_SHORT
    print("Now in mode", tfmini.mode)

    while True:
        print("Distance: %d cm (strength %d, mode %x)" %
              (tfmini.distance, tfmini.strength, tfmini.mode))
        time.sleep(0.1)

Documentation
API documentation for this library can be found on Read the Docs. For information on building library documentation, please check out this guide.

Contributing
Contributions are welcome! Please read our Code of Conduct before contributing to help this project stay welcoming.
adafruit-circuitpython-thermal-printer
Introduction
CircuitPython module for control of various small serial thermal printers.

Dependencies
This driver depends on:
- Adafruit CircuitPython
Please ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloading the Adafruit library and driver bundle.

Installing from PyPI
On supported GNU/Linux systems like the Raspberry Pi, you can install the driver locally from PyPI. To install for the current user:

    pip3 install adafruit-circuitpython-thermal_printer

To install system-wide (this may be required in some cases):

    sudo pip3 install adafruit-circuitpython-thermal_printer

To install in a virtual environment in your current project:

    mkdir project-name && cd project-name
    python3 -m venv .venv
    source .venv/bin/activate
    pip3 install adafruit-circuitpython-thermal_printer

Usage Example
See examples/thermal_printer_simpletest.py for a demo of basic printer usage.

Documentation
API documentation for this library can be found on Read the Docs. For information on building library documentation, please check out this guide.

Contributing
Contributions are welcome! Please read our Code of Conduct before contributing to help this project stay welcoming.
adafruit-circuitpython-thermistor
Introduction
Thermistors are resistors that predictably change resistance with temperature. They are commonly used as a low-cost way to measure temperature. This driver uses an analog reading and math to determine the temperature.

Dependencies
This driver depends on:
- Adafruit CircuitPython
Please ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloading the Adafruit library and driver bundle.

Installing from PyPI
On supported GNU/Linux systems like the Raspberry Pi, you can install the driver locally from PyPI. To install for the current user:

    pip3 install adafruit-circuitpython-thermistor

To install system-wide (this may be required in some cases):

    sudo pip3 install adafruit-circuitpython-thermistor

To install in a virtual environment in your current project:

    mkdir project-name && cd project-name
    python3 -m venv .venv
    source .venv/bin/activate
    pip3 install adafruit-circuitpython-thermistor

Usage Example
The hardest part of using the driver is its initialization. Here is an example for the thermistor on the Circuit Playground and Circuit Playground Express: it's a 10k series resistor, 10k nominal resistance, 25 Celsius nominal temperature, and 3950 B coefficient.

Documentation
API documentation for this library can be found on Read the Docs. For information on building library documentation, please check out this guide.

Contributing
Contributions are welcome! Please read our Code of Conduct before contributing to help this project stay welcoming.
adafruit-circuitpython-ticks
Introduction
Work with intervals and deadlines in milliseconds.

Dependencies
This driver depends on:
- Adafruit CircuitPython
Please ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloading the Adafruit library and driver bundle, or individual libraries can be installed using circup.

Installing from PyPI
On supported GNU/Linux systems like the Raspberry Pi, you can install the driver locally from PyPI. To install for the current user:

    pip3 install adafruit-circuitpython-ticks

To install system-wide (this may be required in some cases):

    sudo pip3 install adafruit-circuitpython-ticks

To install in a virtual environment in your current project:

    mkdir project-name && cd project-name
    python3 -m venv .venv
    source .venv/bin/activate
    pip3 install adafruit-circuitpython-ticks

Installing to a Connected CircuitPython Device with Circup
Make sure that you have circup installed in your Python environment. Install it with the following command if necessary:

    pip3 install circup

With circup installed and your CircuitPython device connected, use the following command to install:

    circup install adafruit_ticks

Or the following command to update an existing version:

    circup update

Usage Example

    from adafruit_ticks import ticks_ms, ticks_add, ticks_less

    # Wait for 100ms to pass
    deadline = ticks_add(ticks_ms(), 100)
    while ticks_less(ticks_ms(), deadline):
        pass

Documentation
API documentation for this library can be found on Read the Docs. For information on building library documentation, please check out this guide.

Contributing
Contributions are welcome! Please read our Code of Conduct before contributing to help this project stay welcoming.
adafruit-circuitpython-tinylora
Introduction
WARNING: This library is not compatible with The Things Network v3 stack. Since TTN has fully migrated to v3, this library is no longer able to communicate with TTN.

LoRaWAN/The Things Network V2, for CircuitPython.

Dependencies
This driver depends on:
- Adafruit CircuitPython
- Bus Device
Please ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloading the Adafruit library and driver bundle.

Installing from PyPI
On supported GNU/Linux systems like the Raspberry Pi, you can install the driver locally from PyPI. To install for the current user:

    pip3 install adafruit-circuitpython-tinylora

To install system-wide (this may be required in some cases):

    sudo pip3 install adafruit-circuitpython-tinylora

To install in a virtual environment in your current project:

    mkdir project-name && cd project-name
    python3 -m venv .venv
    source .venv/bin/activate
    pip3 install adafruit-circuitpython-tinylora

Usage Example
Usage is described in the learn guide for this library.

Documentation
API documentation for this library can be found on Read the Docs. For information on building library documentation, please check out this guide.

Contributing
Contributions are welcome! Please read our Code of Conduct before contributing to help this project stay welcoming.

License
This library was written by Clemens Riederer. We've converted it to work with Adafruit CircuitPython and made changes so it works with the Raspberry Pi and Adafruit Feather M0/M4. We've added examples for using this library to transmit data and sensor data to The Things Network. This open source code is licensed under the LGPL license (see LICENSE for details).
adafruit-circuitpython-tla202x
Introduction
Library for the TI TLA202x 12-bit ADCs.

Dependencies
This driver depends on:
- Adafruit CircuitPython
- Bus Device
- Register
Please ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloading the Adafruit library and driver bundle.

Installing from PyPI
On supported GNU/Linux systems like the Raspberry Pi, you can install the driver locally from PyPI. To install for the current user:

    pip3 install adafruit-circuitpython-tla202x

To install system-wide (this may be required in some cases):

    sudo pip3 install adafruit-circuitpython-tla202x

To install in a virtual environment in your current project:

    mkdir project-name && cd project-name
    python3 -m venv .venv
    source .venv/bin/activate
    pip3 install adafruit-circuitpython-tla202x

Usage Example

    import board
    import busio
    from adafruit_tla202x import TLA2024

    i2c = busio.I2C(board.SCL, board.SDA)
    tla = TLA2024(i2c)
    for channel in range(4):
        tla.input_channel = channel
        print("Channel %d: %.2f V" % (channel, tla.voltage))

Documentation
API documentation for this library can be found on Read the Docs. For information on building library documentation, please check out this guide.

Contributing
Contributions are welcome! Please read our Code of Conduct before contributing to help this project stay welcoming.
adafruit-circuitpython-tlc5947
Introduction
CircuitPython module for the TLC5947 12-bit 24 channel LED PWM driver.

Dependencies
This driver depends on:
- Adafruit CircuitPython
Please ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloading the Adafruit library and driver bundle.

Installing from PyPI
On supported GNU/Linux systems like the Raspberry Pi, you can install the driver locally from PyPI. To install for the current user:

    pip3 install adafruit-circuitpython-tlc5947

To install system-wide (this may be required in some cases):

    sudo pip3 install adafruit-circuitpython-tlc5947

To install in a virtual environment in your current project:

    mkdir project-name && cd project-name
    python3 -m venv .venv
    source .venv/bin/activate
    pip3 install adafruit-circuitpython-tlc5947

Usage Example
See examples/tlc5947_simpletest.py for a demo of the usage. See examples/tlc5947_chain.py for a demo of chained driver usage.

Documentation
API documentation for this library can be found on Read the Docs. For information on building library documentation, please check out this guide.

Contributing
Contributions are welcome! Please read our Code of Conduct before contributing to help this project stay welcoming.
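Under the hood, a 24-channel 12-bit driver like the TLC5947 takes its grayscale data as one 288-bit (36-byte) shift-register frame. The library handles this internally; the pure-Python sketch below only illustrates the packing idea, under the assumption (based on the TLC5947 datasheet, not this library's source) that the highest-numbered channel is shifted out first, MSB first:

```python
def pack_tlc5947(values):
    """Pack 24 12-bit PWM values into a 36-byte shift-register frame.

    Illustration only: assumes channel 23 is shifted out first, MSB first,
    per the TLC5947 datasheet. The library does this packing for you.
    """
    assert len(values) == 24
    bits = 0
    for v in reversed(values):      # channel 23 first, channel 0 last
        bits = (bits << 12) | (v & 0xFFF)
    return bits.to_bytes(36, "big")  # 24 * 12 bits = 288 bits = 36 bytes

frame = pack_tlc5947([0xFFF] + [0] * 23)  # channel 0 fully on
print(frame[-2:].hex())  # channel 0 occupies the final 12 bits: '0fff'
```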
adafruit-circuitpython-tlc59711
Introduction
CircuitPython module for the TLC59711 16-bit 12 channel LED PWM driver.

Dependencies
This driver depends on:
- Adafruit CircuitPython
Please ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloading the Adafruit library and driver bundle.

Installing from PyPI
On supported GNU/Linux systems like the Raspberry Pi, you can install the driver locally from PyPI. To install for the current user:

    pip3 install adafruit-circuitpython-tlc59711

To install system-wide (this may be required in some cases):

    sudo pip3 install adafruit-circuitpython-tlc59711

To install in a virtual environment in your current project:

    mkdir project-name && cd project-name
    python3 -m venv .venv
    source .venv/bin/activate
    pip3 install adafruit-circuitpython-tlc59711

Usage Example
See examples/tlc59711_simpletest.py for a demo of the usage.

Documentation
API documentation for this library can be found on Read the Docs. For information on building library documentation, please check out this guide.

Contributing
Contributions are welcome! Please read our Code of Conduct before contributing to help this project stay welcoming.
adafruit-circuitpython-tlv493d
Introduction
CircuitPython helper library for the TLV493D 3-axis magnetometer.

Dependencies
This driver depends on:
- Adafruit CircuitPython
- Bus Device
- Register
Please ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloading the Adafruit library and driver bundle.

Installing from PyPI
On supported GNU/Linux systems like the Raspberry Pi, you can install the driver locally from PyPI. To install for the current user:

    pip3 install adafruit-circuitpython-tlv493d

To install system-wide (this may be required in some cases):

    sudo pip3 install adafruit-circuitpython-tlv493d

To install in a virtual environment in your current project:

    mkdir project-name && cd project-name
    python3 -m venv .venv
    source .venv/bin/activate
    pip3 install adafruit-circuitpython-tlv493d

Usage Example

    import time
    import board
    import adafruit_tlv493d

    i2c = board.I2C()  # uses board.SCL and board.SDA
    tlv = adafruit_tlv493d.TLV493D(i2c)

    while True:
        print("X: %s, Y: %s, Z: %s uT" % tlv.magnetic)
        time.sleep(1)

Documentation
API documentation for this library can be found on Read the Docs. For information on building library documentation, please check out this guide.

Contributing
Contributions are welcome! Please read our Code of Conduct before contributing to help this project stay welcoming.
adafruit-circuitpython-tmp006
Introduction
CircuitPython driver for the TMP006 contactless IR thermometer.

Dependencies
This driver depends on:
- Adafruit CircuitPython
- Bus Device
Please ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloading the Adafruit library and driver bundle.

Installing from PyPI
On supported GNU/Linux systems like the Raspberry Pi, you can install the driver locally from PyPI. To install for the current user:

    pip3 install adafruit-circuitpython-tmp006

To install system-wide (this may be required in some cases):

    sudo pip3 install adafruit-circuitpython-tmp006

To install in a virtual environment in your current project:

    mkdir project-name && cd project-name
    python3 -m venv .venv
    source .venv/bin/activate
    pip3 install adafruit-circuitpython-tmp006

Usage Example
Ensure your device works with the simple test in the examples folder.

Documentation
API documentation for this library can be found on Read the Docs. For information on building library documentation, please check out this guide.

Contributing
Contributions are welcome! Please read our Code of Conduct before contributing to help this project stay welcoming.
adafruit-circuitpython-tmp007
Introduction
CircuitPython driver for the TMP007 contactless IR thermometer.

Dependencies
This driver depends on:
- Adafruit CircuitPython
- Bus Device
Please ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloading the Adafruit library and driver bundle.

Installing from PyPI
On supported GNU/Linux systems like the Raspberry Pi, you can install the driver locally from PyPI. To install for the current user:

    pip3 install adafruit-circuitpython-tmp007

To install system-wide (this may be required in some cases):

    sudo pip3 install adafruit-circuitpython-tmp007

To install in a virtual environment in your current project:

    mkdir project-name && cd project-name
    python3 -m venv .venv
    source .venv/bin/activate
    pip3 install adafruit-circuitpython-tmp007

Usage Example

Documentation
API documentation for this library can be found on Read the Docs. For information on building library documentation, please check out this guide.

Contributing
Contributions are welcome! Please read our Code of Conduct before contributing to help this project stay welcoming.
adafruit-circuitpython-tmp117
Introduction
CircuitPython library for the TI TMP117 Temperature sensor.

WARNING: Library may not run on some boards with less RAM, such as boards using the SAMD21.

Dependencies
This driver depends on:
- Adafruit CircuitPython
- Bus Device
- Register
Please ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloading the Adafruit library and driver bundle.

Installing from PyPI
On supported GNU/Linux systems like the Raspberry Pi, you can install the driver locally from PyPI. To install for the current user:

    pip3 install adafruit-circuitpython-tmp117

To install system-wide (this may be required in some cases):

    sudo pip3 install adafruit-circuitpython-tmp117

To install in a virtual environment in your current project:

    mkdir project-name && cd project-name
    python3 -m venv .venv
    source .venv/bin/activate
    pip3 install adafruit-circuitpython-tmp117

Usage Example

    import time
    import board
    import adafruit_tmp117

    i2c = board.I2C()  # uses board.SCL and board.SDA
    tmp117 = adafruit_tmp117.TMP117(i2c)

    while True:
        print("Temperature: %.2f degrees C" % tmp117.temperature)
        time.sleep(1)

Documentation
API documentation for this library can be found on Read the Docs. For information on building library documentation, please check out this guide.

Contributing
Contributions are welcome! Please read our Code of Conduct before contributing to help this project stay welcoming.
adafruit-circuitpython-touchscreen
Introduction
CircuitPython library for 4-wire resistive touchscreens.

Dependencies
This driver depends on:
- Adafruit CircuitPython
Please ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloading the Adafruit library and driver bundle.

Installing from PyPI
On supported GNU/Linux systems like the Raspberry Pi, you can install the driver locally from PyPI. To install for the current user:

    pip3 install adafruit-circuitpython-touchscreen

To install system-wide (this may be required in some cases):

    sudo pip3 install adafruit-circuitpython-touchscreen

To install in a virtual environment in your current project:

    mkdir project-name && cd project-name
    python3 -m venv .venv
    source .venv/bin/activate
    pip3 install adafruit-circuitpython-touchscreen

Usage Example

    import board
    import adafruit_touchscreen

    # These pins are used as both analog and digital!
    # XR, XL and YU must be analog and digital capable.
    # YD just needs to be digital.
    ts = adafruit_touchscreen.Touchscreen(board.TOUCH_XL, board.TOUCH_XR,
                                          board.TOUCH_YD, board.TOUCH_YU)

    while True:
        p = ts.touch_point
        if p:
            print(p)

Documentation
API documentation for this library can be found on Read the Docs. For information on building library documentation, please check out this guide.

Contributing
Contributions are welcome! Please read our Code of Conduct before contributing to help this project stay welcoming.
adafruit-circuitpython-tpa2016
Introduction
CircuitPython driver for the TPA2016 Class D Amplifier.

Dependencies
This driver depends on:
- Adafruit CircuitPython
- Bus Device
- Register
Please ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloading the Adafruit library and driver bundle.

Installing from PyPI
On supported GNU/Linux systems like the Raspberry Pi, you can install the driver locally from PyPI. To install for the current user:

    pip3 install adafruit-circuitpython-tpa2016

To install system-wide (this may be required in some cases):

    sudo pip3 install adafruit-circuitpython-tpa2016

To install in a virtual environment in your current project:

    mkdir project-name && cd project-name
    python3 -m venv .venv
    source .venv/bin/activate
    pip3 install adafruit-circuitpython-tpa2016

Usage Example

    import board
    import busio
    import adafruit_tpa2016

    i2c = busio.I2C(board.SCL, board.SDA)
    tpa = adafruit_tpa2016.TPA2016(i2c)

    tpa.fixed_gain = -16

Documentation
API documentation for this library can be found on Read the Docs. For information on building library documentation, please check out this guide.

Contributing
Contributions are welcome! Please read our Code of Conduct before contributing to help this project stay welcoming.
adafruit-circuitpython-trellis
Introduction
This library will allow you to control the LEDs and read button presses on the Adafruit Trellis Board. It will work with a single Trellis board, or with a matrix of up to 8 Trellis boards. For more details, see the Adafruit Trellis Learn Guide.

Dependencies
This driver depends on:
- Adafruit CircuitPython 2.0.0+
- Bus Device
Please ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloading the Adafruit library and driver bundle.

Installing from PyPI
On supported GNU/Linux systems like the Raspberry Pi, you can install the driver locally from PyPI. To install for the current user:

    pip3 install adafruit-circuitpython-trellis

To install system-wide (this may be required in some cases):

    sudo pip3 install adafruit-circuitpython-trellis

To install in a virtual environment in your current project:

    mkdir project-name && cd project-name
    python3 -m venv .venv
    source .venv/bin/activate
    pip3 install adafruit-circuitpython-trellis

Usage Example
See examples/trellis_simpletest.py for the full usage example.

    import time
    import busio
    from board import SCL, SDA
    from adafruit_trellis import Trellis

    # Create the I2C interface
    i2c = busio.I2C(SCL, SDA)

    # Create a Trellis object for each board
    trellis = Trellis(i2c)  # 0x70 when no I2C address is supplied

    # Turn on every LED
    print('Turning all LEDs on...')
    trellis.led.fill(True)
    time.sleep(2)

    # Turn off every LED
    print('Turning all LEDs off...')
    trellis.led.fill(False)
    time.sleep(2)

Documentation
API documentation for this library can be found on Read the Docs. For information on building library documentation, please check out this guide.

Contributing
Contributions are welcome! Please read our Code of Conduct before contributing to help this project stay welcoming.
adafruit-circuitpython-trellism4
Introduction
This high level library provides objects that represent Trellis M4 hardware.

Dependencies
This driver depends on:
- Adafruit CircuitPython
Please ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloading the Adafruit library and driver bundle.

Installing from PyPI
On supported GNU/Linux systems like the Raspberry Pi, you can install the driver locally from PyPI. To install for the current user:

    pip3 install adafruit-circuitpython-trellism4

To install system-wide (this may be required in some cases):

    sudo pip3 install adafruit-circuitpython-trellism4

To install in a virtual environment in your current project:

    mkdir project-name && cd project-name
    python3 -m venv .venv
    source .venv/bin/activate
    pip3 install adafruit-circuitpython-trellism4

Usage Example
This example prints out the coordinates of a button each time it is pressed and released:

    import time
    import adafruit_trellism4

    trellis = adafruit_trellism4.TrellisM4Express()

    current_press = set()
    while True:
        pressed = set(trellis.pressed_keys)
        for press in pressed - current_press:
            print("Pressed:", press)
        for release in current_press - pressed:
            print("Released:", release)
        time.sleep(0.08)
        current_press = pressed

Documentation
API documentation for this library can be found on Read the Docs. For information on building library documentation, please check out this guide.

Contributing
Contributions are welcome! Please read our Code of Conduct before contributing to help this project stay welcoming.
adafruit-circuitpython-tsc2007
Introduction
Python library for the TSC2007 resistive touch screen driver.

Dependencies
This driver depends on:
- Adafruit CircuitPython
- Bus Device
Please ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloading the Adafruit library and driver bundle, or individual libraries can be installed using circup.

Works with the Adafruit TSC2007 resistive touch driver. Purchase one from the Adafruit shop.

Installing from PyPI
On supported GNU/Linux systems like the Raspberry Pi, you can install the driver locally from PyPI. To install for the current user:

    pip3 install adafruit-circuitpython-tsc2007

To install system-wide (this may be required in some cases):

    sudo pip3 install adafruit-circuitpython-tsc2007

To install in a virtual environment in your current project:

    mkdir project-name && cd project-name
    python3 -m venv .venv
    source .venv/bin/activate
    pip3 install adafruit-circuitpython-tsc2007

Installing to a Connected CircuitPython Device with Circup
Make sure that you have circup installed in your Python environment. Install it with the following command if necessary:

    pip3 install circup

With circup installed and your CircuitPython device connected, use the following command to install:

    circup install adafruit_tsc2007

Or the following command to update an existing version:

    circup update

Usage Example

    import board
    import adafruit_tsc2007

    # Use for I2C
    i2c = board.I2C()

    irq_dio = None  # don't use an irq pin by default
    # uncomment for optional irq input pin so we don't continuously poll the I2C for touches
    # irq_dio = digitalio.DigitalInOut(board.A0)

    tsc = adafruit_tsc2007.TSC2007(i2c, irq=irq_dio)

    while True:
        if tsc.touched:
            point = tsc.touch
            if point["pressure"] < 100:
                # ignore touches with no 'pressure' as false
                continue
            print("Touchpoint: (%d, %d, %d)" % (point["x"], point["y"], point["pressure"]))

Documentation
API documentation for this library can be found on Read the Docs. For information on building library documentation, please check out this guide.

Contributing
Contributions are welcome! Please read our Code of Conduct before contributing to help this project stay welcoming.
adafruit-circuitpython-tsl2561
IntroductionCircuitPython driver for TSL2561 Light Sensor.DependenciesThis driver depends on:Adafruit CircuitPythonBus DevicePlease ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloadingthe Adafruit library and driver bundle.Installing from PyPIOn supported GNU/Linux systems like the Raspberry Pi, you can install the driver locallyfrom PyPI. To install for current user:pip3installadafruit-circuitpython-tsl2561To install system-wide (this may be required in some cases):sudopip3installadafruit-circuitpython-tsl2561To install in a virtual environment in your current project:mkdirproject-name&&cdproject-namepython3-mvenv.venvsource.venv/bin/activatepip3installadafruit-circuitpython-tsl2561Usage Example>>>importboard>>>importbusio>>>i2c=busio.I2C(board.SCL,board.SDA)>>>importadafruit_tsl2561>>>tsl=adafruit_tsl2561.TSL2561(i2c)>>>tsl.lux3294.37DocumentationAPI documentation for this library can be found onRead the Docs.For information on building library documentation, please check outthis guide.ContributingContributions are welcome! Please read ourCode of Conductbefore contributing to help this project stay welcoming.
adafruit-circuitpython-tsl2591
IntroductionCircuitPython module for the TSL2591 high precision light sensor.DependenciesThis driver depends on:Adafruit CircuitPythonBus DevicePlease ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloadingthe Adafruit library and driver bundle.Installing from PyPIOn supported GNU/Linux systems like the Raspberry Pi, you can install the driver locallyfrom PyPI. To install for current user:pip3installadafruit-circuitpython-tsl2591To install system-wide (this may be required in some cases):sudopip3installadafruit-circuitpython-tsl2591To install in a virtual environment in your current project:mkdirproject-name&&cdproject-namepython3-mvenv.venvsource.venv/bin/activatepip3installadafruit-circuitpython-tsl2591Usage ExampleSee examples/tsl2591_simpletest.py for a demo of the usage.DocumentationAPI documentation for this library can be found onRead the Docs.For information on building library documentation, please check outthis guide.ContributingContributions are welcome! Please read ourCode of Conductbefore contributing to help this project stay welcoming.
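The bundled tsl2591_simpletest.py demo boils down to a short reading loop. This is a minimal sketch based on the library's documented `lux`, `infrared`, and `visible` properties; it assumes the sensor is wired to the board's default I2C pins and will only run on CircuitPython hardware:

```python
import time

import board

import adafruit_tsl2591

i2c = board.I2C()  # uses board.SCL and board.SDA
sensor = adafruit_tsl2591.TSL2591(i2c)

while True:
    # Combined visible + IR reading converted to lux
    print("Lux:", sensor.lux)
    # Raw channel readings
    print("Infrared:", sensor.infrared)
    print("Visible:", sensor.visible)
    time.sleep(1.0)
```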
adafruit-circuitpython-tt21100
IntroductionBasic driver for TT21100 touchscreen drivers. The ESP32-S3 Box dev board uses it.DependenciesThis driver depends on:Adafruit CircuitPythonBus DevicePlease ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloadingthe Adafruit library and driver bundleor individual libraries can be installed usingcircup.Installing from PyPIOn supported GNU/Linux systems like the Raspberry Pi, you can install the driver locallyfrom PyPI. To install for current user:pip3installadafruit-circuitpython-tt21100To install system-wide (this may be required in some cases):sudopip3installadafruit-circuitpython-tt21100To install in a virtual environment in your current project:mkdirproject-name&&cdproject-namepython3-mvenv.venvsource.venv/bin/activatepip3installadafruit-circuitpython-tt21100Installing to a Connected CircuitPython Device with CircupMake sure that you havecircupinstalled in your Python environment. Install it with the following command if necessary:pip3installcircupWithcircupinstalled and your CircuitPython device connected use the following command to install:circupinstalltt21100Or the following command to update an existing version:circupupdateUsage Exampleimporttimeimportbusioimportboardimportadafruit_tt21100# Create library object (named "tt") using a Bus I2C porti2c=busio.I2C(board.SCL,board.SDA)tt=adafruit_tt21100.TT21100(i2c)whileTrue:# if the screen is being touched print the touchesiftt.touched:print(tt.touches)time.sleep(0.15)DocumentationAPI documentation for this library can be found onRead the Docs.For information on building library documentation, please check outthis guide.ContributingContributions are welcome! Please read ourCode of Conductbefore contributing to help this project stay welcoming.
adafruit-circuitpython-turtle
IntroductionTurtle graphics library for CircuitPython and displayioDependenciesThis driver depends on:Adafruit CircuitPythonBus DevicePlease ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloadingthe Adafruit library and driver bundle.Installing from PyPIOn supported GNU/Linux systems like the Raspberry Pi, you can install the driver locallyfrom PyPI. To install for current user:pip3installadafruit-circuitpython-turtleTo install system-wide (this may be required in some cases):sudopip3installadafruit-circuitpython-turtleTo install in a virtual environment in your current project:mkdirproject-name&&cdproject-namepython3-mvenv.venvsource.venv/bin/activatepip3installadafruit-circuitpython-turtleUsage Exampleimportboardfromadafruit_turtleimportColor,turtleturtle=turtle(board.DISPLAY)starsize=min(board.DISPLAY.width,board.DISPLAY.height)*0.9# 90% of screensizeprint("Turtle time! Lets draw a star")turtle.pencolor(Color.BLUE)turtle.penup()turtle.goto(-starsize/2,0)turtle.pendown()start=turtle.pos()whileTrue:turtle.forward(starsize)turtle.left(170)ifabs(turtle.pos()-start)<1:breakwhileTrue:passDocumentationAPI documentation for this library can be found onRead the Docs.For information on building library documentation, please check outthis guide.ContributingContributions are welcome! Please read ourCode of Conductbefore contributing to help this project stay welcoming.
adafruit-circuitpython-typing
IntroductionDefinitions not in the standardtypingmodule that are needed for type annotation of CircuitPython code.This library is not needed at runtime for CircuitPython code, and does not need to be in the bundle.DependenciesInstalling from PyPIOn supported GNU/Linux systems like the Raspberry Pi, you can install the driver locallyfrom PyPI. To install for current user:pip3installadafruit-circuitpython-typingTo install system-wide (this may be required in some cases):sudopip3installadafruit-circuitpython-typingTo install in a virtual environment in your current project:mkdirproject-name&&cdproject-namepython3-mvenv.venvsource.venv/bin/activatepip3installadafruit-circuitpython-typingDocumentationAPI documentation for this library can be found onRead the Docs.For information on building library documentation, please check outthis guide.ContributingContributions are welcome! Please read ourCode of Conductbefore contributing to help this project stay welcoming.
adafruit-circuitpython-uc8151d
IntroductionCircuitPythondisplayiodriver for UC8151D-based ePaper displaysDependenciesThis driver depends on:Adafruit CircuitPythonPlease ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloadingthe Adafruit library and driver bundleor individual libraries can be installed usingcircup.Adafruit 2.9” Flexible 296x128 Monochrome eInk / ePaper DisplayPurchase one from the Adafruit shopInstalling from PyPIOn supported GNU/Linux systems like the Raspberry Pi, you can install the driver locallyfrom PyPI. To install for current user:pip3installadafruit-circuitpython-uc8151dTo install system-wide (this may be required in some cases):sudopip3installadafruit-circuitpython-uc8151dTo install in a virtual environment in your current project:mkdirproject-name&&cdproject-namepython3-mvenv.venvsource.venv/bin/activatepip3installadafruit-circuitpython-uc8151dInstalling to a Connected CircuitPython Device with CircupMake sure that you havecircupinstalled in your Python environment.
Install it with the following command if necessary:pip3installcircupWithcircupinstalled and your CircuitPython device connected use the following command to install:circupinstalladafruit_uc8151dOr the following command to update an existing version:circupupdateUsage Exampleimporttimeimportboardimportdisplayioimportfourwireimportadafruit_uc8151ddisplayio.release_displays()# This pinout works on a Feather M4 and may need to be altered for other boards.spi=board.SPI()# Uses SCK and MOSIepd_cs=board.D9epd_dc=board.D10epd_reset=board.D5epd_busy=Nonedisplay_bus=fourwire.FourWire(spi,command=epd_dc,chip_select=epd_cs,reset=epd_reset,baudrate=1000000)time.sleep(1)display=adafruit_uc8151d.UC8151D(display_bus,width=296,height=128,rotation=90,busy_pin=epd_busy)g=displayio.Group()withopen("/display-ruler.bmp","rb")asf:pic=displayio.OnDiskBitmap(f)t=displayio.TileGrid(pic,pixel_shader=pic.pixel_shader)g.append(t)display.root_group=gdisplay.refresh()time.sleep(120)DocumentationAPI documentation for this library can be found onRead the Docs.For information on building library documentation, please check outthis guide.ContributingContributions are welcome! Please read ourCode of Conductbefore contributing to help this project stay welcoming.
adafruit-circuitpython-us100
IntroductionCircuitPython library for reading distance and temperature via US-100 ultra-sonic sensorDependenciesThis driver depends on:Adafruit CircuitPythonPlease ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloadingthe Adafruit library and driver bundle.Installing from PyPIOn supported GNU/Linux systems like the Raspberry Pi, you can install the driver locallyfrom PyPI. To install for current user:pip3installadafruit-circuitpython-us100To install system-wide (this may be required in some cases):sudopip3installadafruit-circuitpython-us100To install in a virtual environment in your current project:mkdirproject-name&&cdproject-namepython3-mvenv.venvsource.venv/bin/activatepip3installadafruit-circuitpython-us100Usage Exampleimporttimeimportboardimportbusioimportadafruit_us100uart=busio.UART(board.TX,board.RX,baudrate=9600)# Create a US-100 module instance.us100=adafruit_us100.US100(uart)whileTrue:print("-----")print(f"Temperature:{us100.temperature}°C")print(f"Distance:{us100.distance}cm")time.sleep(0.5)DocumentationAPI documentation for this library can be found onRead the Docs.For information on building library documentation, please check outthis guide.ContributingContributions are welcome! Please read ourCode of Conductbefore contributing to help this project stay welcoming.
adafruit-circuitpython-usb-host-descriptors
IntroductionHelpers for getting USB descriptorsDependenciesThis driver depends on:Adafruit CircuitPython 9+Please ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloadingthe Adafruit library and driver bundleor individual libraries can be installed usingcircup.Installing from PyPIOn supported GNU/Linux systems like the Raspberry Pi, you can install the driver locallyfrom PyPI. To install for current user:pip3installadafruit-circuitpython-usb-host-descriptorsTo install system-wide (this may be required in some cases):sudopip3installadafruit-circuitpython-usb-host-descriptorsTo install in a virtual environment in your current project:mkdirproject-name&&cdproject-namepython3-mvenv.venvsource.venv/bin/activatepip3installadafruit-circuitpython-usb-host-descriptorsInstalling to a Connected CircuitPython Device with CircupMake sure that you havecircupinstalled in your Python environment. Install it with the following command if necessary:pip3installcircupWithcircupinstalled and your CircuitPython device connected use the following command to install:circupinstalladafruit_usb_host_descriptorsOr the following command to update an existing version:circupupdateUsage ExamplePrint basic information about a device and its first (and usually only) configuration.# SPDX-FileCopyrightText: 2017 Scott Shawcroft, written for Adafruit Industries# SPDX-FileCopyrightText: Copyright (c) 2023 Scott Shawcroft for Adafruit Industries## SPDX-License-Identifier: Unlicenseimporttimeimportusb.coreimportadafruit_usb_host_descriptorsDIR_IN=0x80whileTrue:print("searching for devices")fordeviceinusb.core.find(find_all=True):print("pid",hex(device.idProduct))print("vid",hex(device.idVendor))print("man",device.manufacturer)print("product",device.product)print("serial",device.serial_number)print("config[0]:")config_descriptor=adafruit_usb_host_descriptors.get_configuration_descriptor(device,0)i=0whilei<len(config_descriptor):descriptor_len=config_descriptor[i]descriptor_type=config_descriptor[i+1]ifdescriptor_type==adafruit_usb_host_descriptors.DESC_CONFIGURATION:config_value=config_descriptor[i+5]print(f" value{config_value:d}")elifdescriptor_type==adafruit_usb_host_descriptors.DESC_INTERFACE:interface_number=config_descriptor[i+2]interface_class=config_descriptor[i+5]interface_subclass=config_descriptor[i+6]print(f" interface[{interface_number:d}]")print(f" class{interface_class:02x}subclass{interface_subclass:02x}")elifdescriptor_type==adafruit_usb_host_descriptors.DESC_ENDPOINT:endpoint_address=config_descriptor[i+2]ifendpoint_address&DIR_IN:print(f" IN{endpoint_address:02x}")else:print(f" OUT{endpoint_address:02x}")i+=descriptor_lenprint()time.sleep(5)DocumentationAPI documentation for this library can be found onRead the Docs.For information on building library documentation, please check outthis guide.ContributingContributions are welcome! Please read ourCode of Conductbefore contributing to help this project stay welcoming.
adafruit-circuitpython-vc0706
IntroductionCircuitPython module for use with theVC0706 serial TTL camera. Allows basic image capture and download of image data from the camera over a serial connection. See examples for demo of saving image to a SD card (must be wired up separately) or internally.DependenciesThis driver depends on:Adafruit CircuitPythonPlease ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloadingthe Adafruit library and driver bundle.Installing from PyPIOn supported GNU/Linux systems like the Raspberry Pi, you can install the driver locallyfrom PyPI. To install for current user:pip3installadafruit-circuitpython-vc0706To install system-wide (this may be required in some cases):sudopip3installadafruit-circuitpython-vc0706To install in a virtual environment in your current project:mkdirproject-name&&cdproject-namepython3-mvenv.venvsource.venv/bin/activatepip3installadafruit-circuitpython-vc0706Usage ExampleSee examples/snapshot.py.DocumentationAPI documentation for this library can be found onRead the Docs.For information on building library documentation, please check outthis guide.ContributingContributions are welcome! Please read ourCode of Conductbefore contributing to help this project stay welcoming.
adafruit-circuitpython-vcnl4010
IntroductionCircuitPython module for the VCNL4010 proximity and light sensor.DependenciesThis driver depends on:Adafruit CircuitPythonBus DevicePlease ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloadingthe Adafruit library and driver bundle.Installing from PyPIOn supported GNU/Linux systems like the Raspberry Pi, you can install the driver locallyfrom PyPI. To install for current user:pip3installadafruit-circuitpython-vcnl4010To install system-wide (this may be required in some cases):sudopip3installadafruit-circuitpython-vcnl4010To install in a virtual environment in your current project:mkdirproject-name&&cdproject-namepython3-mvenv.venvsource.venv/bin/activatepip3installadafruit-circuitpython-vcnl4010Usage ExampleSee examples/vcnl4010_simpletest.py for an example of the usage.DocumentationAPI documentation for this library can be found onRead the Docs.For information on building library documentation, please check outthis guide.ContributingContributions are welcome! Please read ourCode of Conductbefore contributing to help this project stay welcoming.
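The vcnl4010_simpletest.py demo referenced above amounts to polling two properties in a loop. This is a minimal sketch based on the library's documented `proximity` and `ambient_lux` properties; it assumes default I2C wiring and only runs on CircuitPython hardware:

```python
import time

import board

import adafruit_vcnl4010

i2c = board.I2C()  # uses board.SCL and board.SDA
sensor = adafruit_vcnl4010.VCNL4010(i2c)

while True:
    # Proximity is a unitless value; it increases as an object gets closer
    print("Proximity:", sensor.proximity)
    print("Ambient light:", sensor.ambient_lux, "lux")
    time.sleep(1.0)
```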
adafruit-circuitpython-vcnl4020
IntroductionDriver for the VCNL4020 proximity and light sensorDependenciesThis driver depends on:Adafruit CircuitPythonBus DeviceRegisterPlease ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloadingthe Adafruit library and driver bundleor individual libraries can be installed usingcircup.Purchase one from the Adafruit shopInstalling from PyPIOn supported GNU/Linux systems like the Raspberry Pi, you can install the driver locallyfrom PyPI. To install for current user:pip3installadafruit-circuitpython-vcnl4020To install system-wide (this may be required in some cases):sudopip3installadafruit-circuitpython-vcnl4020To install in a virtual environment in your current project:mkdirproject-name&&cdproject-namepython3-mvenv.venvsource.venv/bin/activatepip3installadafruit-circuitpython-vcnl4020Installing to a Connected CircuitPython Device with CircupMake sure that you havecircupinstalled in your Python environment. Install it with the following command if necessary:pip3installcircupWithcircupinstalled and your CircuitPython device connected use the following command to install:circupinstalladafruit_vcnl4020Or the following command to update an existing version:circupupdateUsage Exampleimporttimeimportboardimportadafruit_vcnl4020i2c=board.I2C()# Initialize VCNL4020sensor=adafruit_vcnl4020.Adafruit_VCNL4020(i2c)whileTrue:print(f"Proximity is:{sensor.proximity}")print(f"Ambient is:{sensor.lux}")time.sleep(0.1)DocumentationAPI documentation for this library can be found onRead the Docs.For information on building library documentation, please check outthis guide.ContributingContributions are welcome! Please read ourCode of Conductbefore contributing to help this project stay welcoming.
adafruit-circuitpython-vcnl4040
IntroductionA CircuitPython library for the VCNL4040 proximity and ambient light sensor.DependenciesThis driver depends on:Adafruit CircuitPythonBus DeviceRegisterPlease ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloadingthe Adafruit library and driver bundle.Installing from PyPIOn supported GNU/Linux systems like the Raspberry Pi, you can install the driver locallyfrom PyPI. To install for current user:pip3installadafruit-circuitpython-vcnl4040To install system-wide (this may be required in some cases):sudopip3installadafruit-circuitpython-vcnl4040To install in a virtual environment in your current project:mkdirproject-name&&cdproject-namepython3-mvenv.venvsource.venv/bin/activatepip3installadafruit-circuitpython-vcnl4040Usage Exampleimporttimeimportboardimportadafruit_vcnl4040i2c=board.I2C()sensor=adafruit_vcnl4040.VCNL4040(i2c)whileTrue:print("Proximity:",sensor.proximity)print("Light:",sensor.light)print("White:",sensor.white)time.sleep(0.3)DocumentationAPI documentation for this library can be found onRead the Docs.For information on building library documentation, please check outthis guide.ContributingContributions are welcome! Please read ourCode of Conductbefore contributing to help this project stay welcoming.
adafruit-circuitpython-veml6070
IntroductionCircuitPython driver for theVEML6070 UV Index Sensor BreakoutDependenciesThis driver depends on:Adafruit CircuitPythonBus DevicePlease ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloadingthe Adafruit library and driver bundle.Installing from PyPIOn supported GNU/Linux systems like the Raspberry Pi, you can install the driver locallyfrom PyPI. To install for current user:pip3installadafruit-circuitpython-veml6070To install system-wide (this may be required in some cases):sudopip3installadafruit-circuitpython-veml6070To install in a virtual environment in your current project:mkdirproject-name&&cdproject-namepython3-mvenv.venvsource.venv/bin/activatepip3installadafruit-circuitpython-veml6070Usage Exampleimporttimeimportboardfromadafruit_veml6070importVEML6070withboard.I2C()asi2c:uv=VEML6070(i2c)# Alternative constructors with parameters#uv = VEML6070(i2c, 'VEML6070_1_T')#uv = VEML6070(i2c, 'VEML6070_HALF_T', True)# take 10 readingsforjinrange(10):uv_raw=uv.uv_rawrisk_level=uv.get_index(uv_raw)print('Reading:{0}| Risk Level:{1}'.format(uv_raw,risk_level))time.sleep(1)DocumentationAPI documentation for this library can be found onRead the Docs.For information on building library documentation, please check outthis guide.ContributingContributions are welcome! Please read ourCode of Conductbefore contributing to help this project stay welcoming.
adafruit-circuitpython-veml6075
IntroductionCircuitPython library to support VEML6075 UVA & UVB sensor.DependenciesThis driver depends on:Adafruit CircuitPythonBus DevicePlease ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloadingthe Adafruit library and driver bundle.Installing from PyPIOn supported GNU/Linux systems like the Raspberry Pi, you can install the driver locallyfrom PyPI. To install for current user:pip3installadafruit-circuitpython-veml6075To install system-wide (this may be required in some cases):sudopip3installadafruit-circuitpython-veml6075To install in a virtual environment in your current project:mkdirproject-name&&cdproject-namepython3-mvenv.venvsource.venv/bin/activatepip3installadafruit-circuitpython-veml6075Usage Exampleimporttimeimportboardimportbusioimportadafruit_veml6075i2c=busio.I2C(board.SCL,board.SDA)veml=adafruit_veml6075.VEML6075(i2c,integration_time=100)whileTrue:print(veml.uv_index)time.sleep(1)DocumentationAPI documentation for this library can be found onRead the Docs.For information on building library documentation, please check outthis guide.ContributingContributions are welcome! Please read ourCode of Conductbefore contributing to help this project stay welcoming.
adafruit-circuitpython-veml7700
IntroductionCircuitPython driver for VEML7700 high precision I2C ambient light sensor.DependenciesThis driver depends on:Adafruit CircuitPythonBus DeviceRegisterPlease ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloadingthe Adafruit library and driver bundle.Installing from PyPIOn supported GNU/Linux systems like the Raspberry Pi, you can install the driver locallyfrom PyPI. To install for current user:pip3installadafruit-circuitpython-veml7700To install system-wide (this may be required in some cases):sudopip3installadafruit-circuitpython-veml7700To install in a virtual environment in your current project:mkdirproject-name&&cdproject-namepython3-mvenv.venvsource.venv/bin/activatepip3installadafruit-circuitpython-veml7700Usage Exampleimporttimeimportboardimportadafruit_veml7700i2c=board.I2C()# uses board.SCL and board.SDAveml7700=adafruit_veml7700.VEML7700(i2c)whileTrue:print("Ambient light:",veml7700.light)time.sleep(0.1)DocumentationAPI documentation for this library can be found onRead the Docs.For information on building library documentation, please check outthis guide.ContributingContributions are welcome! Please read ourCode of Conductbefore contributing to help this project stay welcoming.
adafruit-circuitpython-vl53l0x
IntroductionCircuitPython driver for the VL53L0X distance sensor.DependenciesThis driver depends on:Adafruit CircuitPythonBus DevicePlease ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloadingthe Adafruit library and driver bundle.Installing from PyPIOn supported GNU/Linux systems like the Raspberry Pi, you can install the driver locallyfrom PyPI. To install for current user:pip3installadafruit-circuitpython-vl53l0xTo install system-wide (this may be required in some cases):sudopip3installadafruit-circuitpython-vl53l0xTo install in a virtual environment in your current project:mkdirproject-name&&cdproject-namepython3-mvenv.venvsource.venv/bin/activatepip3installadafruit-circuitpython-vl53l0xUsage ExampleSee usage in the examples/vl53l0x_simpletest.py file.DocumentationAPI documentation for this library can be found onRead the Docs.For information on building library documentation, please check outthis guide.ContributingContributions are welcome! Please read ourCode of Conductbefore contributing to help this project stay welcoming.
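The vl53l0x_simpletest.py example referenced above reduces to reading one property. This is a minimal sketch based on the library's documented `range` property; it assumes default I2C wiring and only runs on CircuitPython hardware:

```python
import time

import board

import adafruit_vl53l0x

i2c = board.I2C()  # uses board.SCL and board.SDA
sensor = adafruit_vl53l0x.VL53L0X(i2c)

while True:
    # Range is reported in millimeters
    print("Range: {} mm".format(sensor.range))
    time.sleep(0.5)
```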
adafruit-circuitpython-vl53l1x
IntroductionCircuitPython module for interacting with the VL53L1X distance sensor.DependenciesThis driver depends on:Adafruit CircuitPythonBus DevicePlease ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloadingthe Adafruit library and driver bundleor individual libraries can be installed usingcircup.Adafruit VL53L1X Time of Flight Distance Sensor - ~30 to 4000mm - STEMMA QT / QwiicPurchase one from the Adafruit shopInstalling from PyPIOn supported GNU/Linux systems like the Raspberry Pi, you can install the driver locallyfrom PyPI. To install for current user:pip3installadafruit-circuitpython-vl53l1xTo install system-wide (this may be required in some cases):sudopip3installadafruit-circuitpython-vl53l1xTo install in a virtual environment in your current project:mkdirproject-name&&cdproject-namepython3-mvenv.venvsource.venv/bin/activatepip3installadafruit-circuitpython-vl53l1xInstalling to a Connected CircuitPython Device with CircupMake sure that you havecircupinstalled in your Python environment. Install it with the following command if necessary:pip3installcircupWithcircupinstalled and your CircuitPython device connected use the following command to install:circupinstalladafruit_vl53l1xOr the following command to update an existing version:circupupdateUsage ExampleSee examples/vl53l1x_simpletest.py for basic usage example.DocumentationAPI documentation for this library can be found onRead the Docs.For information on building library documentation, please check outthis guide.ContributingContributions are welcome! Please read ourCode of Conductbefore contributing to help this project stay welcoming.
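The vl53l1x_simpletest.py example referenced above follows a start/poll/clear pattern. This is a minimal sketch based on the library's documented `start_ranging()`, `data_ready`, `distance`, and `clear_interrupt()` API; it assumes default I2C wiring and only runs on CircuitPython hardware:

```python
import time

import board

import adafruit_vl53l1x

i2c = board.I2C()  # uses board.SCL and board.SDA
vl53 = adafruit_vl53l1x.VL53L1X(i2c)

vl53.start_ranging()
while True:
    if vl53.data_ready:
        # Distance is reported in centimeters
        print("Distance: {} cm".format(vl53.distance))
        # Acknowledge the measurement so the next one can begin
        vl53.clear_interrupt()
    time.sleep(0.1)
```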
adafruit-circuitpython-vl53l4cd
IntroductionCircuitPython helper library for the VL53L4CD time of flight distance sensor.DependenciesThis driver depends on:Adafruit CircuitPythonBus DevicePlease ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloadingthe Adafruit library and driver bundleor individual libraries can be installed usingcircup.Adafruit VL53L4CD Time of Flight Distance Sensor - ~1 to 1300mm - STEMMA QT / QwiicPurchase one from the Adafruit shopInstalling from PyPIOn supported GNU/Linux systems like the Raspberry Pi, you can install the driver locallyfrom PyPI. To install for current user:pip3installadafruit-circuitpython-vl53l4cdTo install system-wide (this may be required in some cases):sudopip3installadafruit-circuitpython-vl53l4cdTo install in a virtual environment in your current project:mkdirproject-name&&cdproject-namepython3-mvenv.venvsource.venv/bin/activatepip3installadafruit-circuitpython-vl53l4cdInstalling to a Connected CircuitPython Device with CircupMake sure that you havecircupinstalled in your Python environment. Install it with the following command if necessary:pip3installcircupWithcircupinstalled and your CircuitPython device connected use the following command to install:circupinstallvl53l4cdOr the following command to update an existing version:circupupdateUsage ExampleSee examples/vl53l4cd_simpletest.py for basic usage example.DocumentationAPI documentation for this library can be found onRead the Docs.For information on building library documentation, please check outthis guide.ContributingContributions are welcome! Please read ourCode of Conductbefore contributing to help this project stay welcoming.
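The vl53l4cd_simpletest.py example referenced above uses the same start/poll/clear pattern as the other ST time-of-flight drivers. This is a minimal sketch based on the library's documented `start_ranging()`, `data_ready`, `distance`, and `clear_interrupt()` API; it assumes default I2C wiring and only runs on CircuitPython hardware:

```python
import time

import board

import adafruit_vl53l4cd

i2c = board.I2C()  # uses board.SCL and board.SDA
vl53 = adafruit_vl53l4cd.VL53L4CD(i2c)

vl53.start_ranging()
while True:
    # Wait for a fresh measurement, then acknowledge it
    while not vl53.data_ready:
        pass
    vl53.clear_interrupt()
    # Distance is reported in centimeters
    print("Distance: {} cm".format(vl53.distance))
    time.sleep(0.1)
```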
adafruit-circuitpython-vl6180x
IntroductionCircuitPython module for the VL6180X distance sensor. See examples/vl6180x_simpletest.py for a demo of the usage.DependenciesThis driver depends on:Adafruit CircuitPythonBus DevicePlease ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloadingthe Adafruit library and driver bundle.Installing from PyPIOn supported GNU/Linux systems like the Raspberry Pi, you can install the driver locallyfrom PyPI. To install for current user:pip3installadafruit-circuitpython-vl6180xTo install system-wide (this may be required in some cases):sudopip3installadafruit-circuitpython-vl6180xTo install in a virtual environment in your current project:mkdirproject-name&&cdproject-namepython3-mvenv.venvsource.venv/bin/activatepip3installadafruit-circuitpython-vl6180xUsage ExampleSee examples/vl6180x_simpletest.py for a demo of the usage.DocumentationAPI documentation for this library can be found onRead the Docs.For information on building library documentation, please check outthis guide.ContributingContributions are welcome! Please read ourCode of Conductbefore contributing to help this project stay welcoming.
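The vl6180x_simpletest.py demo referenced above reads range and ambient light in a loop. This is a minimal sketch based on the library's documented `range` property and `read_lux()` method (which takes one of the library's ALS gain constants); it assumes default I2C wiring and only runs on CircuitPython hardware:

```python
import time

import board

import adafruit_vl6180x

i2c = board.I2C()  # uses board.SCL and board.SDA
sensor = adafruit_vl6180x.VL6180X(i2c)

while True:
    # Range is reported in millimeters
    print("Range: {} mm".format(sensor.range))
    # Ambient light requires selecting an ALS gain setting
    print("Light: {} lux".format(sensor.read_lux(adafruit_vl6180x.ALS_GAIN_1)))
    time.sleep(1.0)
```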
adafruit-circuitpython-vs1053
IntroductionDriver for interacting and playing media files with the VS1053 audio codec over a SPI connection.NOTE: This is not currently working for audio playback of files. Only sine wave test currently works. The problem is that pure Python code is currently too slow to keep up with feeding data to the VS1053 fast enough. There’s no interrupt support so Python code has to monitor the DREQ line and provide a small buffer of data when ready, but the overhead of the interpreter means we can’t keep up. Optimizing SPI to use DMA transfers could help but ultimately an interrupt-based approach is likely what can make this work better (or C functions built in to custom builds that monitor the DREQ line and feed a buffer of data).DependenciesThis driver depends on:Adafruit CircuitPythonBus DevicePlease ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloadingthe Adafruit library and driver bundle.Installing from PyPIOn supported GNU/Linux systems like the Raspberry Pi, you can install the driver locallyfrom PyPI. To install for current user:pip3installadafruit-circuitpython-vs1053To install system-wide (this may be required in some cases):sudopip3installadafruit-circuitpython-vs1053To install in a virtual environment in your current project:mkdirproject-name&&cdproject-namepython3-mvenv.venvsource.venv/bin/activatepip3installadafruit-circuitpython-vs1053Usage ExampleSee examples/sdfile_play.py.DocumentationAPI documentation for this library can be found onRead the Docs.For information on building library documentation, please check outthis guide.ContributingContributions are welcome! Please read ourCode of Conductbefore contributing to help this project stay welcoming.
adafruit-circuitpython-wave
IntroductionRead and write standard WAV-format filesDependenciesThis driver depends on:Adafruit CircuitPythonPlease ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloadingthe Adafruit library and driver bundleor individual libraries can be installed usingcircup.Installing from PyPIOn supported GNU/Linux systems like the Raspberry Pi, you can install the driver locallyfrom PyPI. To install for current user:pip3installadafruit-circuitpython-waveTo install system-wide (this may be required in some cases):sudopip3installadafruit-circuitpython-waveTo install in a virtual environment in your current project:mkdirproject-name&&cdproject-namepython3-mvenv.venvsource.venv/bin/activatepip3installadafruit-circuitpython-waveInstalling to a Connected CircuitPython Device with CircupMake sure that you havecircupinstalled in your Python environment. Install it with the following command if necessary:pip3installcircupWithcircupinstalled and your CircuitPython device connected use the following command to install:circupinstalladafruit_waveOr the following command to update an existing version:circupupdateUsage Exampleimportadafruit_wavewithadafruit_wave.open("sample.wav")asw:print(w.getsampwidth())print(w.getnchannels())print(list(memoryview(w.readframes(100)).cast("h")))DocumentationAPI documentation for this library can be found onRead the Docs.For information on building library documentation, please check outthis guide.ContributingContributions are welcome! Please read ourCode of Conductbefore contributing to help this project stay welcoming.
adafruit-circuitpython-waveform
IntroductionThis library generates simple waveforms that can be used to generate different type of audio signals.DependenciesThis driver depends on:Adafruit CircuitPythonPlease ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloadingthe Adafruit library and driver bundle.Installing from PyPIOn supported GNU/Linux systems like the Raspberry Pi, you can install the driver locallyfrom PyPI. To install for current user:pip3installadafruit-circuitpython-waveformTo install system-wide (this may be required in some cases):sudopip3installadafruit-circuitpython-waveformTo install in a virtual environment in your current project:mkdirproject-name&&cdproject-namepython3-mvenv.venvsource.venv/bin/activatepip3installadafruit-circuitpython-waveformUsage ExampleThis example generates one wavelength of a 440hz sine wave when played at 16 kilosamples per second:fromadafruit_waveformimportsinewave=sine.sine_wave(16000,440)DocumentationAPI documentation for this library can be found onRead the Docs.For information on building library documentation, please check outthis guide.ContributingContributions are welcome! Please read ourCode of Conductbefore contributing to help this project stay welcoming.
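For illustration, the arithmetic behind that example can be sketched in plain Python. This is a hypothetical re-implementation, not the library's actual code (the real `sine.sine_wave` may differ, e.g. in the sample format it emits):

```python
import array
import math

def sine_wave(sample_rate, frequency, amplitude=(2 ** 15 - 1)):
    """One wavelength of a sine wave as signed 16-bit samples.
    Hypothetical sketch; not the library's actual implementation."""
    length = sample_rate // frequency          # samples per wavelength
    return array.array(
        "h",
        [int(amplitude * math.sin(2 * math.pi * i / length))
         for i in range(length)],
    )

wave = sine_wave(16000, 440)
print(len(wave))  # 36 samples: 16000 samples/s over one 440 Hz period
```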
adafruit-circuitpython-wii-classic
IntroductionCircuitPython library for Nintendo Wii Classic controllers.DependenciesThis driver depends on:Adafruit CircuitPythonBus DevicePlease ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloadingthe Adafruit library and driver bundleor individual libraries can be installed usingcircup.Works with the Wii Nunchuck Breakout Adapter and a Wii Classic Controller.Purchase one from the Adafruit shop.Installing from PyPIOn supported GNU/Linux systems like the Raspberry Pi, you can install the driver locallyfrom PyPI. To install for current user:pip3installadafruit-circuitpython-wii-classicTo install system-wide (this may be required in some cases):sudopip3installadafruit-circuitpython-wii-classicTo install in a virtual environment in your current project:mkdirproject-name&&cdproject-namepython3-mvenv.venvsource.venv/bin/activatepip3installadafruit-circuitpython-wii-classicInstalling to a Connected CircuitPython Device with CircupMake sure that you havecircupinstalled in your Python environment.
Install it with the following command if necessary:pip3installcircupWithcircupinstalled and your CircuitPython device connected use the following command to install:circupinstalladafruit_wii_classicOr the following command to update an existing version:circupupdateUsage Exampleimporttimeimportboardimportadafruit_wii_classici2c=board.STEMMA_I2C()ctrl_pad=adafruit_wii_classic.Wii_Classic(i2c)whileTrue:left_x,left_y=ctrl_pad.joystick_lright_x,right_y=ctrl_pad.joystick_rleft_pressure=ctrl_pad.l_shoulder.LEFT_FORCEright_pressure=ctrl_pad.r_shoulder.RIGHT_FORCEprint("joystick_l ={},{}".format(left_x,left_y))print("joystick_r ={},{}".format(right_x,right_y))print("left shoulder ={}".format(left_pressure))print("right shoulder ={}".format(right_pressure))ifctrl_pad.buttons.A:print("button A")ifctrl_pad.buttons.B:print("button B")ifctrl_pad.d_pad.UP:print("d_pad Up")time.sleep(0.5)DocumentationAPI documentation for this library can be found onRead the Docs.For information on building library documentation, please check outthis guide.ContributingContributions are welcome! Please read ourCode of Conductbefore contributing to help this project stay welcoming.
adafruit-circuitpython-wiznet5k
IntroductionPure-Python interface for WIZNET 5k ethernet modules.DependenciesThis driver depends on:Adafruit CircuitPythonBus DevicePlease ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloadingthe Adafruit library and driver bundle.Installing from PyPIOn supported GNU/Linux systems like the Raspberry Pi, you can install the driver locallyfrom PyPI. To install for current user:pip3installadafruit-circuitpython-wiznet5kTo install system-wide (this may be required in some cases):sudopip3installadafruit-circuitpython-wiznet5kTo install in a virtual environment in your current project:mkdirproject-name&&cdproject-namepython3-mvenv.venvsource.venv/bin/activatepip3installadafruit-circuitpython-wiznet5kUsage ExampleThis example demonstrates making a HTTP GET request to wifitest.adafruit.com.importboardimportbusioimportdigitalioimportadafruit_requestsasrequestsfromadafruit_wiznet5k.adafruit_wiznet5kimportWIZNET5Kimportadafruit_wiznet5k.adafruit_wiznet5k_socketassocketprint("Wiznet5k WebClient Test")TEXT_URL="http://wifitest.adafruit.com/testwifi/index.html"JSON_URL="http://api.coindesk.com/v1/bpi/currentprice/USD.json"cs=digitalio.DigitalInOut(board.D10)spi_bus=busio.SPI(board.SCK,MOSI=board.MOSI,MISO=board.MISO)# Initialize ethernet interface with DHCPeth=WIZNET5K(spi_bus,cs)# Initialize a requests object with a socket and ethernet interfacerequests.set_socket(socket,eth)print("Chip Version:",eth.chip)print("MAC Address:",[hex(i)foriineth.mac_address])print("My IP address is:",eth.pretty_ip(eth.ip_address))print("IP lookup adafruit.com:%s"%eth.pretty_ip(eth.get_host_by_name("adafruit.com")))#eth._debug = Trueprint("Fetching text from",TEXT_URL)r=requests.get(TEXT_URL)print('-'*40)print(r.text)print('-'*40)r.close()print()print("Fetching json from",JSON_URL)r=requests.get(JSON_URL)print('-'*40)print(r.json())print('-'*40)r.close()print("Done!")This example demonstrates a simple web server that allows setting the 
Neopixel color.importboardimportbusioimportdigitalioimportneopixelfromadafruit_wiznet5k.adafruit_wiznet5kimportWIZNET5Kimportadafruit_wiznet5k.adafruit_wiznet5k_wsgiserverasserverfromadafruit_wsgi.wsgi_appimportWSGIAppprint("Wiznet5k Web Server Test")# Status LEDled=neopixel.NeoPixel(board.NEOPIXEL,1)led.brightness=0.3led[0]=(0,0,255)# W5500 connectionscs=digitalio.DigitalInOut(board.D10)spi_bus=busio.SPI(board.SCK,MOSI=board.MOSI,MISO=board.MISO)# Initialize Ethernet interface with DHCPeth=WIZNET5K(spi_bus,cs)# Here we create our application, registering the# following functions to be called on specific HTTP GET requests routesweb_app=WSGIApp()@web_app.route("/led/<r>/<g>/<b>")defled_on(request,r,g,b):print("LED handler")led.fill((int(r),int(g),int(b)))return("200 OK",[],["LED set!"])@web_app.route("/")defroot(request):print("Root handler")return("200 OK",[],["Root document"])@web_app.route("/large")deflarge(request):print("Large pattern handler")return("200 OK",[],["*-.-"*2000])# Here we setup our server, passing in our web_app as the applicationserver.set_interface(eth)wsgiServer=server.WSGIServer(80,application=web_app)print("Open this IP in your browser: ",eth.pretty_ip(eth.ip_address))# Start the serverwsgiServer.start()whileTrue:# Our main loop where we have the server poll for incoming requestswsgiServer.update_poll()# Maintain DHCP leaseeth.maintain_dhcp_lease()# Could do any other background tasks here, like reading sensorsDocumentationAPI documentation for this library can be found onRead the Docs.For information on building library documentation, please check outthis guide.ContributingContributions are welcome! Please read ourCode of Conductbefore contributing to help this project stay welcoming.LicenseThis library was written byArduino LLC. 
We’ve converted it to work withCircuitPythonand made changes so it works similarly toCircuitPython’s WIZNET5k wrapper for the WIZnet 5500 Ethernet interfaceand CPython’sSocket low-level networking interface module.This open source code is licensed under the LGPL license (see LICENSE for details).
adafruit-circuitpython-ws2801
IntroductionHigher level WS2801 driver that presents the LED string as a sequence. It is the same api as theNeoPixel library.Colors are stored as tuples by default. However, you can also use int hex syntax to set values similar to colors on the web. For example,0x800000(#800000on the web) is equivalent to(0x80, 0, 0).DependenciesThis driver depends on:Adafruit CircuitPythonBus DevicePlease ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloadingthe Adafruit library and driver bundle.Installing from PyPIOn supported GNU/Linux systems like the Raspberry Pi, you can install the driver locallyfrom PyPI. To install for current user:pip3installadafruit-circuitpython-ws2801To install system-wide (this may be required in some cases):sudopip3installadafruit-circuitpython-ws2801To install in a virtual environment in your current project:mkdirproject-name&&cdproject-namepython3-mvenv.venvsource.venv/bin/activatepip3installadafruit-circuitpython-ws2801Usage ExampleThis example demonstrates the library drivinga strand of 25 RGB ledsby aGemma M0using the hardware SPI capable outputs.importboardimportadafruit_ws2801leds=adafruit_ws2801.WS2801(board.D2,board.D0,25)leds.fill((0x80,0,0))DocumentationAPI documentation for this library can be found onRead the Docs.For information on building library documentation, please check outthis guide.ContributingContributions are welcome! Please read ourCode of Conductbefore contributing to help this project stay welcoming.
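The equivalence between the int hex form and the tuple form can be shown with a small conversion helper (hypothetical, for illustration only; the library itself accepts either form directly):

```python
def int_to_rgb(color):
    """Unpack a 0xRRGGBB integer into an (r, g, b) tuple."""
    return ((color >> 16) & 0xFF, (color >> 8) & 0xFF, color & 0xFF)

print(int_to_rgb(0x800000))  # (128, 0, 0), i.e. (0x80, 0, 0)
```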
adafruit-circuitpython-wsgi
IntroductionCircuitPython framework for creating WSGI compatible web server applications.DependenciesThis driver depends on:Adafruit CircuitPythonPlease ensure all dependencies are available on the CircuitPython filesystem. This is easily achieved by downloadingthe Adafruit library and driver bundle.Installing from PyPINoteThis library is not available on PyPI yet. Install documentation is included as a standard element. Stay tuned for PyPI availability!On supported GNU/Linux systems like the Raspberry Pi, you can install the driver locallyfrom PyPI. To install for current user:pip3installadafruit-circuitpython-wsgiTo install system-wide (this may be required in some cases):sudopip3installadafruit-circuitpython-wsgiTo install in a virtual environment in your current project:mkdirproject-name&&cdproject-namepython3-mvenv.venvsource.venv/bin/activatepip3installadafruit-circuitpython-wsgiDocumentationAPI documentation for this library can be found onRead the Docs.For information on building library documentation, please check outthis guide.ContributingContributions are welcome! Please read ourCode of Conductbefore contributing to help this project stay welcoming.Sphinx documentationSphinx is used to build the documentation based on rST files and comments in the code. First, install dependencies (feel free to reuse the virtual environment from above):python3-mvenv.venvsource.venv/bin/activatepipinstallSphinxsphinx-rtd-themeNow, once you have the virtual environment activated:cddocssphinx-build-E-W-bhtml._build/htmlThis will output the documentation todocs/_build/html. Open the index.html in your browser to view them. It will also (due to -W) error out on any warning like Travis will. This is a good way to locally verify it will pass.
adafruit-extended-bus
IntroductionHelper Library for Blinka to allow creating I2C and SPI busio objects by passing in the Bus ID. This library is not compatible with CircuitPython and is intended to only be run on Linux devices.DependenciesThis driver depends on:Adafruit PythonPlease ensure all dependencies are available on the Python filesystem. This is easily achieved by downloadingthe Adafruit library and driver bundle.Installing from PyPIOn supported GNU/Linux systems like the Raspberry Pi, you can install the driver locallyfrom PyPI. To install for current user:pip3installadafruit-extended-busTo install system-wide (this may be required in some cases):sudopip3installadafruit-extended-busTo install in a virtual environment in your current project:mkdirproject-name&&cdproject-namepython3-mvenv.envsource.env/bin/activatepip3installadafruit-extended-busUsage Examplefromadafruit_extended_busimportExtendedI2CasI2Cimportadafruit_bme280# Create library object using our Extended Bus I2C porti2c=I2C(1)# Device is /dev/i2c-1bme280=adafruit_bme280.Adafruit_BME280_I2C(i2c)print("\nTemperature:%0.1fC"%bme280.temperature)ContributingContributions are welcome! Please read ourCode of Conductbefore contributing to help this project stay welcoming.DocumentationFor information on building library documentation, please check outthis guide.
adafruit-fingerprint
adafruit-fingerprintThis library enables you to communicate with the R305 Fingerprint Identification Module over a serial connection with an upper computer.The library enables you to communicate with the R305 fingerprint module from an upper computer (your laptop, a Raspberry Pi, etc.), rather than the Arduino which it was built for by default. The module communicates via TTL; therefore, to communicate you need aUSB - TTL converterconnected to the module.It provides a class that exposes methods you can call to perform serial read/write actions with the module; these methods are implemented according to the datasheet specification, which you can download from the repohere.Installationpipinstalladafruit-fingerprintUsage exampleThere is a section on thedocs - hardware sectionwith pictures and explanations on how to set up the hardware needed for these examples to work.Visit theExamples Codessection on thedocsto view all examples.Development setupRequirementsPython3.x (x >= 6)pipHardware setup. Seedocs - hardware section.Get startedpipinstall-rrequirements.txtRunning testspython-munittestdiscovertestsRelease History1.0.0Good enough for most use casesMetaThis package is heavily inspired by thefinger_sphinxproject, found during our search for a way to get the fingerprint template to the upper computer, rather than have it stored in the flash library. A very big kudos and acknowledgement to the owners.Faith Odonghanro –@toritsejuFO(twitter),toritsejuFO(github)Nwanozie Promise –@PNwanozie(twitter),iotstudent(github)Adegoke Joshua –@iAmCodedebugger(twitter),Ade-Joshe(github)Distributed under the MIT license. SeeLICENSEfor more information.Github Repository
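As one example of the datasheet-driven framing such a class implements, R305-style packets carry a two-byte checksum over the packet identifier, the length field, and the payload bytes. A sketch of that arithmetic (`packet_checksum` is a hypothetical helper; verify the exact packet layout against the datasheet in the repo):

```python
def packet_checksum(packet_id, length, payload):
    """Sum of the packet id, the two length bytes, and every payload
    byte, truncated to 16 bits. Hypothetical helper for illustration;
    check the R305 datasheet for the authoritative layout."""
    total = packet_id + (length >> 8) + (length & 0xFF) + sum(payload)
    return total & 0xFFFF

# Command packet id 0x01 carrying a single instruction byte as payload,
# with declared length 0x0003 (payload plus the two checksum bytes):
print(hex(packet_checksum(0x01, 0x0003, b"\x01")))  # 0x5
```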
adafruit-io
Adafruit IO PythonA Python library and examples for use withio.adafruit.com.Compatible with Python Versions 3.4+InstallationEasy InstallationIf you havePIPinstalled (typically with apt-get install python-pip on a Debian/Ubuntu-based system) then run:pip3installadafruit-ioThis will automatically install the Adafruit IO Python client code for your Python scripts to use. You might want to examine the examples folder in this GitHub repository to see examples of usage.If the above command fails, you may first need to install prerequisites:pip3installsetuptoolspip3installwheelManual InstallationClone or download the contents of this repository. Then navigate to the folder in a terminal and run the following command:pythonsetup.pyinstallUsageDocumentation for this project isavailable on the ReadTheDocs.ContributingContributions are welcome! Please read ourCode of Conductbefore contributing to help this project stay welcoming.
adafruit_lcd_plate_menu
UNKNOWN
adafruit-micropython-register
# Adafruit_MicroPython_Register Python data descriptor classes to represent hardware registers on I2C and SPI devices.
adafruit-nrfutil
adafruit-nrfutiladafruit-nrfutilis a Python package that includes theadafruit-nrfutilcommand line utility and thenordicsemilibrary.This package is derived from the Nordic Semiconductor ASA packagepc-nrfutil, version 0.5.3. The code has been converted from Python 2 to Python 3.The executablenrfutilhas been renamed toadafruit-nrfutilto distinguish it from the original executable.This tool can be used with theAdafruit nRF52 Featherto flash firmware images onto the device over a simple serial connection.This library is written for Python 3.5+. It is no longer Python 2 compatible!InstallationPrerequisitesPython3pip3Run the following commands to makeadafruit-nrfutilavailable from the command line or to development platforms like the Arduino IDE or CircuitPython:Installing from PyPIThis is the recommended method and installs the latest version:$ pip3 install --user adafruit-nrfutilInstalling from SourceUse this method if you have issues installing from PyPI or want to modify the tool. First clone this repo and go into its folder.$ git clone https://github.com/adafruit/Adafruit_nRF52_nrfutil.git $ cd Adafruit_nRF52_nrfutilNote: the following commands usepython3; however, if you are on Windows, you may need to change it topython, since the Windows installation of Python 3.x still uses the name python.exe.To install in user space in your home directory:$ pip3 install -r requirements.txt $ python3 setup.py installIf you get permission errors when runningpip3 install, yourpip3is older or is set to try to install in the system directories.
In that case use the--userflag:$ pip3 install --user -r requirements.txt $ python3 setup.py installIf you want to install in system directories (generally not recommended):$ sudo pip3 install -r requirements.txt $ sudo python3 setup.py installCreate self-contained binaryTo generate a self-contained executable binary of the utility (Windows and MacOS), run these commands:pip3 install pyinstaller cd Adafruit_nRF52_nrfutil pip3 install -r requirements.txt cd Adafruit_nRF52_nrfutil\nordicsemi pyinstaller __main__.py --onefile --clean --name adafruit-nrfutilYou will find the .exe inAdafruit_nRF52_nrfutil\nordicsemi\dist\adafruit-nrfutil(with.exeif you are on Windows). Copy or move it elsewhere for your convenience, such as a directory in your %PATH%.UsageTo get info on the usage of adafruit-nrfutil:adafruit-nrfutil --helpTo convert an nRF52 .hex file into a DFU pkg file that the serial bootloader can make use of:adafruit-nrfutil dfu genpkg --dev-type 0x0052 --application firmware.hex dfu-package.zipTo flash a DFU pkg file over serial:adafruit-nrfutil dfu serial --package dfu-package.zip -p /dev/tty.SLAB_USBtoUART -b 115200
adafruit-python-shell
IntroductionPython helper for running Shell scripts in PythonDependenciesThis driver depends on:LinuxInstalling from PyPIOn supported GNU/Linux systems like the Raspberry Pi, you can install the driver locallyfrom PyPI. To install for current user:pip3installadafruit-python-shellTo install system-wide (this may be required in some cases):sudopip3installadafruit-python-shellTo install in a virtual environment in your current project:mkdirproject-name&&cdproject-namepython3-mvenv.envsource.env/bin/activatepip3installadafruit-python-shellContributingContributions are welcome! Please read ourCode of Conductbefore contributing to help this project stay welcoming.DocumentationFor information on building library documentation, please check outthis guide.
adafruit-rplidar
IntroductionProvide an interface to the SlamTec RPLidar A1.The eventual goal is for the single module to work with both Linux (via /dev/ttyUSB0, etc.) and CircuitPython (via a UART instance).Usage Example

import os
from math import cos, sin, pi, floor
import pygame
from adafruit_rplidar import RPLidar

# Set up pygame and the display
os.putenv('SDL_FBDEV', '/dev/fb1')
pygame.init()
lcd = pygame.display.set_mode((320, 240))
pygame.mouse.set_visible(False)
lcd.fill((0, 0, 0))
pygame.display.update()

# Set up the RPLidar
PORT_NAME = '/dev/ttyUSB0'
lidar = RPLidar(None, PORT_NAME)

# used to scale data to fit on the screen
max_distance = 0

def process_data(data):
    # Do something useful with the data
    pass

scan_data = [0] * 360

try:
    print(lidar.get_info())
    for scan in lidar.iter_scans():
        for (_, angle, distance) in scan:
            scan_data[min([359, floor(angle)])] = distance
        process_data(scan_data)
except KeyboardInterrupt:
    print('Stopping.')
lidar.stop()
lidar.disconnect()

ContributingContributions are welcome! Please read ourCode of Conductbefore contributing to help this project stay welcoming.
adage
Adage - A DAG ExecutorThis is a small experimental package to see how one could describe workflows that are not completely known at definition time. Tasks should be runnable both in a multiprocessing pool, or using a number of celery workers or a IPython cluster.ExampleProblemWorkflows can be comfortably represented by directed acyclic graphs (DAGs). But sometimes the precise structure of the graph is not known before the processing starts. Instead often one only has partial information of what kind of edges are possible and depending on a certain result in a node the DAG might be appended with more nodes and edges.For example, one node (call it "node A") could be downloading a list of files, which can be processed in parallel. The DAG would therefore have one node for each file-processing (let's call them "node_file_1" to "node_file_n") depending on "node A". Since the exact number of files is not known until run-time, we cannot map out the DAG beforehand. Also after this "map"-step one might want to have a "reduce"-step to merge the individual result. This can also only be scheduled after the number of "map"-nodes is known.Another example is that one might have a whole set of nodes that run a certain kind of task (e.g. produce a PDF file). One could imagine wanting to have a "reduce"-type task which merges all these individual PDF files. While any given node does not know where else PDF-generating tasks are scheduled, one can wait until no edges to PDF-generating tasks are possible anymore to then append a PDF-merging node to the DAG.SolutionGenerically, we want individual nodes to have a limited set of operations they can do on the DAG that they are part of. Specifically we can only allow queries on the structure of the DAG as well as append operations, nodes must not be able to remove nodes. The way we implement this is that we have a append-only record of scheduled rules. A rule is a pair of functions (predicate,body) that operate on the DAG. 
The predicate is a query function that inspects the graph to decide whether the DAG has enough information to apply the body (e.g. are edges of a certain type still possible to append or not?). If the DAG does have enough information, the body, which is an append-only operation on the DAG, is applied, i.e. nodes are added. Periodically the list of rules is iterated to extend the DAG where possible.Rules for RulesThere are a couple of rules that the rules need to obey themselves in order to make this work. It is the responsibility of the predicate to signal that all necessary nodes for the body are present in the DAG. Examples are:wait until no nodes of a particular type could possibly be added to the DAG. This requires us to know what kind of edges are valid on a global level.wait until a certain number of nodes are present in the DAG (say )select a certain set of nodes by their unique id (useful to attach a sub-DAG to an existing node from within that node)The only valid edges that you can dynamically add are ones that point away from existing nodes to new nodes. Edges directedtowardsexisting nodes would introduce new dependencies which were not present before, and so that job might have already run, or be currently running.
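The (predicate, body) scheme described above can be sketched in a few lines of plain Python. All names here (`DAG`, `apply_rules`) are hypothetical illustrations of the idea, not adage's actual API:

```python
class DAG:
    """Toy append-only DAG: nodes can be queried and added, never removed."""
    def __init__(self):
        self.nodes = []

    def add_node(self, node):
        self.nodes.append(node)

def apply_rules(dag, rules):
    """Iterate the rule list, applying any rule whose predicate says the
    DAG now carries enough information for its (append-only) body."""
    pending = list(rules)
    while pending:
        ready = [rule for rule in pending if rule[0](dag)]
        if not ready:
            break                     # no rule can fire yet
        for rule in ready:
            rule[1](dag)              # body appends nodes
            pending.remove(rule)

# Example: once two "download" nodes exist, append a "merge" node.
dag = DAG()
dag.add_node("download_1")
dag.add_node("download_2")
apply_rules(dag, [(
    lambda d: sum(n.startswith("download") for n in d.nodes) >= 2,
    lambda d: d.add_node("merge"),
)])
print(dag.nodes)  # ['download_1', 'download_2', 'merge']
```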
adagio
A-DAG-IO FrameworkJoin Fugue-Project on SlackA DAG IO framework forFugue projectsInstallationpip install adagio
adagios
No description available on PyPI.
ad-AHJZ
No description available on PyPI.
ad-AJHZ
No description available on PyPI.
adal
This library, ADAL for Python, will no longer receive new feature improvements. Instead, use the new libraryMSAL for Python.If you are starting a new project, you can get started with theMSAL Python docsfor details about the scenarios, usage, and relevant concepts.If your application is using the previous ADAL Python library, you can follow thismigration guideto update to MSAL Python.Existing applications relying on ADAL Python will continue to work.Microsoft Azure Active Directory Authentication Library (ADAL) for PythonmasterbranchdevbranchReference DocsGetting StartedDocsPython SamplesSupportFeedbackThe ADAL for Python library enables python applications to authenticate with Azure AD and get tokens to access Azure AD protected web resources.You can learn in detail about ADAL Python functionality and usage documented in theWiki.Installation and UsageYou can find the steps to install and basic usage of the library underADAL Basicspage in the Wiki.Samples and DocumentationWe provide a full suite ofPython sample applications on GitHubto help you get started with learning the Azure Identity system. This will include tutorials for native clients and web applications. We also provide full walkthroughs for authentication flows such as OAuth2, OpenID Connect and for calling APIs such as the Graph API.There are also somelightweight samples existing inside this repo.You can find the relevant samples by scenarios listed in thiswiki page for acquiring tokens using ADAL Python.The documents onAuth ScenariosandAuth protocolsare recommended reading.VersionsThis library followsSemantic Versioning.You can find the changes for each version underReleases.Community Help and SupportWe leverageStack Overflowto work with the community on supporting Azure Active Directory and its SDKs, including this one! We highly recommend you ask your questions on Stack Overflow (we're all on there!) 
Also browse existing issues to see if someone has had your question before.We recommend you use the "adal" tag so we can see it! Here is the latest Q&A on Stack Overflow for ADAL:https://stackoverflow.com/questions/tagged/adalSubmit FeedbackWe'd like your thoughts on this library. Please completethis short survey.Security ReportingIf you find a security issue with our libraries or services please report it to[email protected]with as much detail as possible. Your submission may be eligible for a bounty through theMicrosoft Bountyprogram. Please do not post security issues to GitHub Issues or any other public site. We will contact you shortly upon receiving the information. We encourage you to get notifications of when security incidents occur by visitingthis pageand subscribing to Security Advisory Alerts.ContributingAll code is licensed under the MIT license and we triage actively on GitHub. We enthusiastically welcome contributions and feedback. Please read thecontributing guidebefore starting.We Value and Adhere to the Microsoft Open Source Code of ConductThis project has adopted theMicrosoft Open Source Code of Conduct. For more information see theCode of Conduct FAQor contact[email protected]with any additional questions or comments.
adala
Adala is anAutonomousDAta (Labeling)Agent framework.Adala offers a robust framework for implementing agents specialized in data processing, with an emphasis on diverse data labeling tasks. These agents are autonomous, meaning they can independently acquire one or more skills through iterative learning. This learning process is influenced by their operating environment, observations, and reflections. Users define the environment by providing a ground truth dataset. Every agent learns and applies its skills in what we refer to as a "runtime", synonymous with LLM.📢 Why choose Adala?🌟Reliable agents: Agents are built upon a foundation of ground truth data. This ensures consistent and trustworthy results, making Adala a reliable choice for your data processing needs.🎮Controllable output: For every skill, you can configure the desired output and set specific constraints with varying degrees of flexibility. Whether you want strict adherence to particular guidelines or more adaptive outputs based on the agent's learning, Adala allows you to tailor results to your exact needs.🎯Specialized in data processing: While agents excel in diverse data labeling tasks, they can be customized for a wide range of data processing needs.🧠Autonomous learning: Adala agents aren't just automated; they're intelligent. They iteratively and independently develop skills based on environment, observations, and reflections.✅Flexible and extensible runtime: Adala's runtime environment is adaptable. A single skill can be deployed across multiple runtimes, facilitating dynamic scenarios like the student/teacher architecture. 
Moreover, the openness of framework invites the community to extend and tailor runtimes, ensuring continuous evolution and adaptability to diverse needs.🚀Easily customizable: Quickly customize and develop agents to address challenges specific to your needs, without facing a steep learning curve.🫵 Who is Adala for?Adala is a versatile framework designed for individuals and professionals in the field of AI and machine learning. Here's who can benefit:🧡AI engineers:Architect and design AI agent systems with modular, interconnected skills. Build production-level agent systems, abstracting low-level ML to Adala and LLMs.💻Machine learning researchers:Experiment with complex problem decomposition and causal reasoning.📈Data scientists:Apply agents to preprocess and postprocess your data. Interact with Adala natively through Python notebooks when working with large Dataframes.🏫Educators and students:Use Adala as a teaching tool or as a base for advanced projects and research.While the roles highlighted above are central, it's pivotal to note that Adala is intricately designed to streamline and elevate the AI development journey, catering to all enthusiasts, irrespective of their specific niche in the field. 🥰🔌InstallationInstall Adala:pipinstalladalaAdala frequently releases updates. 
In order to ensure that you are using the most up-to-date version, it is recommended that you install it from GitHub:pipinstallgit+https://github.com/HumanSignal/Adala.git📝 PrerequisitesSet OPENAI_API_KEY (see instructions here)export OPENAI_API_KEY='your-openai-api-key'🎬 QuickstartIn this example we will use Adala as a standalone library directly inside Python notebook.Clickhereto see an extended quickstart example.importpandasaspdfromadala.agentsimportAgentfromadala.environmentsimportStaticEnvironmentfromadala.skillsimportClassificationSkillfromadala.runtimesimportOpenAIChatRuntimefromrichimportprint# Train datasettrain_df=pd.DataFrame([["It was the negative first impressions, and then it started working.","Positive"],["Not loud enough and doesn't turn on like it should.","Negative"],["I don't know what to say.","Neutral"],["Manager was rude, but the most important that mic shows very flat frequency response.","Positive"],["The phone doesn't seem to accept anything except CBR mp3s.","Negative"],["I tried it before, I bought this device for my son.","Neutral"],],columns=["text","sentiment"])# Test datasettest_df=pd.DataFrame(["All three broke within two months of use.","The device worked for a long time, can't say anything bad.","Just a random line of text."],columns=["text"])agent=Agent(# connect to a datasetenvironment=StaticEnvironment(df=train_df),# define a skillskills=ClassificationSkill(name='sentiment',instructions="Label text as positive, negative or neutral.",labels={'sentiment':["Positive","Negative","Neutral"]},input_template="Text:{text}",output_template="Sentiment:{sentiment}"),# define all the different runtimes your skills may useruntimes={# You can specify your OPENAI API KEY here via `OpenAIRuntime(..., api_key='your-api-key')`'openai':OpenAIChatRuntime(model='gpt-3.5-turbo'),},default_runtime='openai',# NOTE! 
If you have access to GPT-4, you can uncomment the lines below for better results# default_teacher_runtime='openai-gpt4',# teacher_runtimes = {# 'openai-gpt4': OpenAIRuntime(model='gpt-4')# })print(agent)print(agent.skills)agent.learn(learning_iterations=3,accuracy_threshold=0.95)print('\n=> Run tests ...')predictions=agent.run(test_df)print('\n=> Test results:')print(predictions)👉 Available skillsClassificationSkill– Classify text into a set of predefined labels.ClassificationSkillWithCoT– Classify text into a set of predefined labels, using Chain-of-Thoughts reasoning.SummarizationSkill– Summarize text into a shorter text.QuestionAnsweringSkill– Answer questions based on a given context.TranslationSkill– Translate text from one language to another.TextGenerationSkill– Generate text based on a given prompt.Skill Sets– Process complex tasks through a sequence of skills.🗺 RoadmapLow-level skill management (i.e. agent.get_skill("name")) [COMPLETE @niklub]Make every notebook example run in Google Colab and add a badge into the READMEExtend environment with one more exampleMulti-task learning (learn multiple skills at once)Calculate and store top line Agent metrics (predictions created, runtime executions, learning loops, etc)Create Named Entity Recognition SkillCommand line utility (see the source for this readme for example)REST API to interact with AdalaVision and multi-modal agent skills🤩 Contributing to AdalaEnhance skills, optimize runtimes, or pioneer new agent types. Whether you're crafting nuanced tasks, refining computational environments, or sculpting specialized agents for unique domains, your contributions will power Adala's evolution. Join us in shaping the future of intelligent systems and making Adala more versatile and impactful for users across the globe.Read morehere.💬 SupportDo you need help or are you looking to engage with the community? Check outDiscord channel!
Whether you have questions, need clarification, or simply want to discuss topics related to the project, the Discord community is welcoming!
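The learning loop in the quickstart stops once accuracy reaches accuracy_threshold=0.95. As a rough illustration of what such an accuracy figure amounts to — this is plain pandas, not Adala's internal evaluation code, and the prediction values are made up — compare a hypothetical column of predictions against the ground-truth labels:

```python
import pandas as pd

# Hypothetical agent output placed next to ground-truth labels
df = pd.DataFrame({
    "sentiment": ["Positive", "Negative", "Neutral", "Positive"],   # ground truth
    "predicted": ["Positive", "Negative", "Positive", "Positive"],  # model output
})

# Fraction of rows where the prediction matches the label
accuracy = (df["sentiment"] == df["predicted"]).mean()
print(f"accuracy = {accuracy:.2f}")  # 3 of 4 rows match -> 0.75
```

In the quickstart, agent.learn keeps iterating until this kind of figure reaches the threshold or learning_iterations is exhausted.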
adala-pk-test
Adala is anAutonomousDAta (Labeling)Agent framework.Adala offers a robust framework for implementing agents specialized in data processing, with a particular emphasis on diverse data labeling tasks. These agents are autonomous, meaning they can independently acquire one or more skills through iterative learning. This learning process is influenced by their operating environment, observations, and reflections. Users define the environment by providing a ground truth dataset. Every agent learns and applies its skills in what we refer to as a "runtime", synonymous with LLM.Why Choose Adala?Reliable Agents: Built upon a foundation of ground truth data, our agents ensure consistent and trustworthy results, making Adala a reliable choice for data processing needs.Controllable Output: For every skill, you can configure the desired output, setting specific constraints with varying degrees of flexibility. Whether you want strict adherence to particular guidelines or more adaptive outputs based on the agent's learning, Adala allows you to tailor results to your exact needs.Specialized in Data Processing: While our agents excel in diverse data labeling tasks, they can be tailored to a wide range of data processing needs.Autonomous Learning: Adala agents aren't just automated; they're intelligent. They iteratively and independently develop skills based on environment, observations, and reflections.Flexible and Extensible Runtime: Adala's runtime environment is adaptable. A single skill can be deployed across multiple runtimes, facilitating dynamic scenarios like the student/teacher architecture. 
Moreover, the openness of our framework invites the community to extend and tailor runtimes, ensuring continuous evolution and adaptability to diverse needs.Extend Skills: Quickly tailor and develop agents to address the specific challenges and nuances of your domain, without facing a steep learning curve.InstallationInstall ADALA:pipinstalladalaIf you're planning to use human-in-the-loop labeling, or need a labeling tool to produce ground truth datasets, we suggest installing Label Studio. Adala is made to support Label Studio format right out of the box.pipinstalllabel-studioPrerequisitesSet OPENAI_API_KEY (see instructions here)export OPENAI_API_KEY='your-openai-api-key'QuickstartIn this example we will use ADALA as a standalone library directly inside our python notebook. You can open it in Collab right here.importpandasaspdfromadala.agentsimportAgentfromadala.datasetsimportDataFrameDatasetfromadala.environmentsimportBasicEnvironmentfromadala.skillsimportClassificationSkillfromrichimportprintprint("=> Initialize datasets ...")# Train datasettrain_df=pd.DataFrame([["It was the negative first impressions, and then it started working.","Positive"],["Not loud enough and doesn't turn on like it should.","Negative"],["I don't know what to say.","Neutral"],["Manager was rude, but the most important that mic shows very flat frequency response.","Positive"],["The phone doesn't seem to accept anything except CBR mp3s.","Negative"],["I tried it before, I bought this device for my son.","Neutral"],],columns=["text","ground_truth"])# Test datasettest_df=pd.DataFrame(["All three broke within two months of use.","The device worked for a long time, can't say anything bad.","Just a random line of text.","Will order from them again!",],columns=["text"])train_dataset=DataFrameDataset(df=train_df)test_dataset=DataFrameDataset(df=test_df)print("=> Initialize and train ADALA agent ...")agent=Agent(# connect to a 
datasetenvironment=BasicEnvironment(ground_truth_dataset=train_dataset,ground_truth_column="ground_truth"),# define a skillskills=ClassificationSkill(name='sentiment_classification',instructions="Label text as subjective or objective.",labels=["Positive","Negative","Neutral"],input_data_field='text'),# uncomment this if you want more quality and you have access to OPENAI GPT-4 model# default_teacher_runtime='openai-gpt4',)print(agent)agent.learn(learning_iterations=3,accuracy_threshold=0.95)print(agent.skills)print('\n=> Run tests ...')run=agent.apply_skills(test_dataset)print('\n=> Test results:')print(run)More NotebooksQuickstart– An extended example of the above with comments and outputs.Creating New Skill– An example that walks you through creating a new skill.Label Studio Tutorial– An example of connecting Adala to an external labeling tool for enhanced supervision.Who Adala is for?Adala is a versatile framework designed for individuals and professionals in the field of AI and machine learning. Here's who can benefit:AI Engineers:Architect and design AI Agent systems with modular, interconnected skills. Build production-level agent systems, abstracting low-level ML to Adala and LLMs.Machine Learning Researchers:Experiment with complex problem decomposition and causal reasoning.Data Scientists:Apply agents to preprocess and postprocess your data. 
Interact with Adala natively through Python notebooks when working with large DataFrames.

Educators and Students: Use Adala as a teaching tool or as a base for advanced projects and research.

While the roles highlighted above are central, Adala is designed to streamline and elevate the AI development journey for all enthusiasts, irrespective of their specific niche in the field.

Roadmap

Create Named Entity Recognition Skill
Extend Environment with one more example
Command-line utility (see the source of this readme for an example)
REST API to interact with Adala

Contributing to Adala

Dive into the heart of Adala by enhancing Skills, optimizing Runtimes, or pioneering new Agent Types. Whether you're crafting nuanced tasks, refining computational environments, or sculpting specialized agents for unique domains, your contributions will power Adala's evolution. Join us in shaping the future of intelligent systems and making Adala more versatile and impactful for users across the globe. Read more here.

Support

Are you in need of assistance or looking to engage with our community? Our Discord channel is the perfect place for real-time support and interaction. Whether you have questions, need clarification, or simply want to discuss topics related to our project, the Discord community is welcoming!
adalib
Adalib

TODO
adalitix
No description available on PyPI.
adam
TimeQL

TimeQL is an experimental declarative programming language.
adamalib
Adamalibprovides a Python library/SDK for interacting withAdama. It is designed to be used as a standalone library in the user’s local machine to develop Adama microservices.InstallationUsepip:pip install git+git://github.com/Arabidopsis-Information-Portal/adamalib.gitIt’ll be moved to PyPI as soon as it reaches some stability.As an alternative, seeusing adamalib in Dockerbelow.Using adamalib in DockerThis repository includes aDockerfileand adocker-compose.ymlfile, which allows a zero installation version ofadamalib.The only requirement isDockeranddocker-compose, most likely already installed in your system.Then, clone this repository and executedocker-composeas follows:$gitclonehttps://github.com/Arabidopsis-Information-Portal/adamalib.git$cdadamalib$docker-composebuild$docker-composeup(a bug indocker-composerequires doing the stepsbuildandupseparately. In the future, onlyupwill be necessary.)Navigate tohttp://localhost:8888and access theJupyternotebook with passwordadamalib. The notebookExample.ipynbcontains a full example of use. The notebookProvenance.ipynbcontains an example of accessing provenance information from Python.Note: If you are running on a Mac withboot2docker, substitutelocalhostby the output of:$boot2dockeripLicenseFree software: MIT license
adamant
No description available on PyPI.
adamantium
No description available on PyPI.
adamapdf
This is the homepage of our project.
adamapi
Installation

Versioning

adamapi==2.2.2.2. This package works only with ADAM CORE 2.

Requirements

sudo apt-get install python3-venv python3-gdal gdal-bin

Install with pip

VENVNAME="adamapi"
python3 -m venv "${VENVNAME}"
source "${VENVNAME}/bin/activate"
python3 -m pip install --upgrade pip
pip install adamapi
ln -s "/usr/lib/python3/dist-packages/osgeo" "${VENVNAME}/lib/python3.8/site-packages/osgeo"

API DEFINITIONS

This document briefly describes the ADAMAPI functionality. The ADAMAPI library is divided into 4 modules:

Auth --> the authorization module
Datasets --> to get the list of datasets
Search --> to get the lists of products, including associated metadata (e.g. geometry, cloud cover, orbit, tile, ...)
GetData --> to retrieve the product(s). It includes options for subsetting products in space and time, for downloading at native data granularity, and for reduced processing capacity

1 - Auth

This module takes care of user authentication and authorization. Without instantiating an object of this module, the other components don't work. The Auth module is based on the ADAMAPI_KEY, a key that uniquely identifies the user.

Class constructor and parameters

from adamapi import Auth
a = Auth()

The Auth() constructor takes no parameters.

Public methods and parameters:

.setKey() --> to set up the ADAMAPI_KEY. Parameters: position 0, mandatory, str — the ADAMAPI_KEY.
.setAdamCore() --> to set up the URL of the ADAM-CORE endpoint. Parameters: position 0, mandatory, str — a URL like https://test.adamplatform.eu.
.authorize() --> to instantiate an auth object. No parameters.
.getAuthToken() --> to get the authorization token. No parameters.

1.1 - ADAMAPI_KEY retrieval

To get the ADAMAPI_KEY, you need to access your ADAM portal and:

Select the "user icon" on the top right
Expand / click the "USERNAME"
Click on the "Api Key" to display your key

*Command-line ADAMAPI_KEY retrieval TBP*

1.2 - ADAMAPI_KEY setup

There are three methods to set up the ADAMAPI_KEY and the ADAM-CORE instance:

Use the methods setKey() and setAdamCore():

from adamapi import Auth
a = Auth()
a.setKey('<ADAMAPI_KEY>')
a.setAdamCore('https://test.adamplatform.eu')

Export two environment variables:

# open a terminal and type:
export ADAMAPI_KEY='<ADAMAPI_KEY>'
export ADAMAPI_URL='https://test.adamplatform.eu'

Create a file called .adamapirc in the user home directory with the following content:

key=<ADAMAPI_KEY>
url=https://test.adamplatform.eu

1.3 - Examples

After the ADAMAPI_KEY has been set up, an auth instance can be created with:

from adamapi import Auth
a = Auth()
a.authorize()

After the authorize method you can retrieve your auth token:

from adamapi import Auth
a = Auth()
a.authorize()
a.getAuthToken()

2 - Datasets

This module provides dataset discovery functionality.

Class constructor and parameters

from adamapi import Datasets
datasets = Datasets(a)

Parameters: position 0, mandatory — the ADAMAPI authorized instance obtained in the previous section.

Public methods and parameters:

.getDatasets() --> to retrieve the dataset list. Parameters: position 0, optional, str — the datasetId; page, optional, numeric (default 0) — indicates a specific page; maxRecords, optional, numeric (default 10) — max number of results in output.

The .getDatasets() function can also be used to retrieve additional filters, which are described in the key filtersEnabled (if it exists).

2.1 - Examples

This module can be used in 2 different ways.

To list all available datasets:

datasets = Datasets(a)
print(datasets.getDatasets())

To get detailed metadata about a specific dataset:

datasets = Datasets(a)
print(datasets.getDatasets('{{ID:DATASET}}', page=0, maxRecords=10))

To get filtersEnabled.
To use these additional filters, see the first example in the Search section.

datasets = Datasets(a)
out = datasets.getDatasets("{{ID:DATASET}}")
print(out["filtersEnabled"])

3 - Search

This module provides discovery functionality over the products available on the ADAM instance.

Class constructor and parameters

from adamapi import Search
search = Search(a)

Parameters: position 0, mandatory — the ADAMAPI authorized instance obtained in section 1 - Auth.

Public methods and parameters:

.getProducts() --> to retrieve the product list and metadata. Parameters: position 0, mandatory, str — the datasetId; maxRecords, optional, int (default 10) — number of records; startIndex, optional, int (default 0) — starting record index; startDate, optional, str or datetime — the start date; endDate, optional, str or datetime — the end date; geometry, optional, str or geojson — GeoJSON geometry (see the geojson format appendix).

3.1 - Examples

Example 1:

search = Search(a)
mongo_search = search.getProducts('{{ID:DATASET}}', maxRecords=1, startIndex=0, platform="{{VALUE}}")

Example 2:

search = Search(a)
mongo_search = search.getProducts('{{ID:DATASET}}', maxRecords=1, startIndex=0)

4 - GetData

This module provides data access to rasters, spatial subsets, and timeseries at the native data granularity, plus reduced processing capacity.

Class constructor and parameters

from adamapi import GetData
data = GetData(a)

Parameters: position 0, mandatory — the ADAMAPI authorized instance obtained in section 1 - Auth.

Public methods and parameters:

.getData() --> to retrieve a specific product or a dataset at its native granularity, to get a subset of it, to perform a timeseries, or to execute simple processing. Parameters: position 0, mandatory, str — the datasetId; position 1, mandatory, str (default GetFile) — the request type.
available values: GetFile,GetSubset, GetTimeseries and GetProcessingasynchronousFalsebooleanFalserappesents how the request will be performedcompressFalsebooleanFalsereturn a zip filerestFalsebooleanTrueperform RESTful order ignoring explorer state on the server and equalization configured using the explorer guifiltersTruejson{}json object with filters parameter. startDate and endDate are required inside it. Geometry is not required for GetFile operation, it is otherwiseoptionsFalsejson{}request optionoutputDirFalsestradamapiresults/set a different download directory insideadamapiresult/main directory4.1 Examplesdata=GetData(a)#to retrive a specific productimage=data.getData('{{ID:DATASET}}',"GetFile",asynchronous=False,compress=False,rest=False,filters={"startDate":'{{STARTDATE}}',"endDate":'{{ENDDATE}}',"productId":'{{PRODUCTID}}'},outputDir='{{OUTPUT_DIR}}')#to retrieve a dataset in its native granularitydata=GetData(self.a)image=data.getData('{{ID:DATASET}}',"GetFile",asynchronous=False,compress=False,rest=False,filters={"startDate":'{{STARTDATE}}',"endDate":'{{ENDDATE}}',"geometry":'{{GEOMETRY}}'},outputDir='{{OUTPUT_DIR}}')For the GetSubset,GetTimeseries and GetProcessing requests you need to add theoptionsparameter with these constraints :output formatsandfunctions(only for processing request)#subset exampleimage=data.getData('{{ID:DATASET}}',"GetSubset",asynchronous=False,compress=False,rest=False,filters={"startDate":'{{STARTDATE}}',"endDate":'{{ENDDATE}}',"geometry":'{{GEOMETRY}}'},options={"format":'{{FORMATS}}'},outputDir='{{OUTPUT_DIR}}')#timeseries exampleimage=data.getData('{{ID:DATASET}}',"GetTimeseries",asynchronous=False,compress=False,rest=False,filters={"startDate":'{{STARTDATE}}',"endDate":'{{ENDDATE}}',"geometry":'{{GEOMETRY}}'},options={"format":'{{FORMATS}}'},outputDir='{{OUTPUT_DIR}}')#processing 
exampleimage=data.getData('{{ID:DATASET}}',"GetProcessing",asynchronous=False,compress=False,rest=False,filters={"startDate":'{{STARTDATE}}',"endDate":'{{ENDDATE}}',"geometry":'{{GEOMETRY}}'},options={"format":'{{FORMAT}}',"function":'{{FUNCTION}}'},outputDir='{{OUTPUT_DIR}}')4.3 Asyncronous Example#1. execute the requestimage=data.getData('{{ID:DATASET}}',"GetSubset",asynchronous=False,compress=False,rest=False,filters={"startDate":'{{STARTDATE}}',"endDate":'{{ENDDATE}}',"geometry":'{{GEOMETRY}}'},options={"format":'{{FORMATS}}'},outputDir='{{OUTPUT_DIR}}')#2. check the statusstat=data.getData(datasetId,"GetSubset",asynchronous=True,id=str(image.pk))whilestat.status!="completed":time.sleep(1)stat=data.getData(datasetId,"GetSubset",asynchronous=True,id=str(image.pk))#3. download the zip,unzip it and remove the zip (optional)forresinstat.list:ifres["status"]=="failed":print(res["exit_code"])else:r=self.a.client(res["download"]["url"],{},"GET")withopen(str(res["download"]["url"].split("/")[4])+"_"+str(res["download"]["url"].split("/")[5]),'wb')asf:f.write(r.content)Appendix 1 - Data formatdate and date+timeSupported string date/date+time format are:'%Y-%m-%dT%H:%M:%S','%Y-%m-%dT%H:%M:%SZ','%Y-%m-%d'GeoJsonGeometry have to follow the latest geojson standardrfc7946In particular Polygons and MultiPolygons should follow the right-hand ruleGeometry#This geometry will return all the results it has intersected within itgeometry={"type":"Polygon","coordinates":[[[43.916666667,15.716666667],[43.916666667,15.416666667],[44.216666667,15.416666667],[44.216666667,15.716666667],[43.916666667,15.716666667]]]}#This geometry will return all the results it has intersected on its outsidegeometry={"type":"Polygon","coordinates":[[[43.84986877441406,15.925676536359038],[44.6539306640625,15.950766025306109],[44.681396484375,15.194084972583916],[43.8189697265625,15.20998780073036],[43.84986877441406,15.925676536359038]]]}Output Formatsrequestoutput 
formatGetFile-GetSubsettiff,pngGetTimeseriesjson,csvGetProcessingexperimentaltiff,pngProcessing FunctiontypedescriptionaverageWhen the GetProcessing retrieves a multi-band product or a set of products it executes the average of their valuesoverlapWhen the GetProcessing retrieves a set of products, it executes their overlap without any specific strategymosterecentWhen the GetProcessing retrieves a set of products, it puts on the top the most recent oneleastrecentWhen the GetProcessing retrieves a set of products, it puts on top the least recent oneminvalueWhen the GetProcessing retrieves a multi-band product or a set of products for each pixel it puts on top the minimum value of the pixelmaxvalueWhen the GetProcessing retrieves a multi-band product or a set of products for each pixel it for each pixel, puts on top the maximum value of the pixel
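The startDate/endDate values passed to getProducts() and getData() must use one of the string formats listed in Appendix 1. A small helper — hypothetical, not part of adamapi — that validates a value against exactly those formats before a request is built:

```python
from datetime import datetime

# The three string formats documented in Appendix 1
SUPPORTED_FORMATS = ("%Y-%m-%dT%H:%M:%S", "%Y-%m-%dT%H:%M:%SZ", "%Y-%m-%d")

def parse_adamapi_date(value):
    """Parse a date string in one of the Appendix 1 formats,
    raising ValueError if none of them matches."""
    for fmt in SUPPORTED_FORMATS:
        try:
            return datetime.strptime(value, fmt)
        except ValueError:
            continue
    raise ValueError(f"unsupported date format: {value!r}")

parse_adamapi_date("2023-01-15")           # date only
parse_adamapi_date("2023-01-15T12:30:00Z") # date + time with trailing Z
```

Checking filters locally like this surfaces format mistakes before the request ever reaches the server.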
adamatics-keycloak
Adamatics Keycloakadamatics-keycloakis a Python package providing access to the Keycloak API.This is a forked version of the python-keycloak library (https://github.com/marcospereirampj/python-keycloak). This fork tries to expand upon the functionality and provide more stability and updates to the package.This package is mainly maintained byAdamaticscompany.InstallationVia Pypi Package:$ pip install adamatics-keycloakManually$ python setup.py installDependenciesadamatics-keycloak depends on:Python 3requestspython-joseurllib3Test and Build Dependenciestoxpytestpytest-covwheelBug reportsPlease report bugs and feature requests athttps://github.com/adamatics/adamatics-keycloak/issuesContributorsAgriness TeamMarcos PereiraMartin DevlinShon T. UrbasMarkus SpanierRemco KranenburgArminnjordrJosha InglisAlexEwan JoneLukas MartiniAdamaticsUsagefromkeycloakimportKeycloakOpenID# Configure clientkeycloak_openid=KeycloakOpenID(server_url="http://localhost:8080/auth/",client_id="example_client",realm_name="example_realm",client_secret_key="secret")# Get WellKnowconfig_well_know=keycloak_openid.well_know()# Get Tokentoken=keycloak_openid.token("user","password")token=keycloak_openid.token("user","password",totp="012345")# Get Userinfouserinfo=keycloak_openid.userinfo(token['access_token'])# Refresh tokentoken=keycloak_openid.refresh_token(token['refresh_token'])# Logoutkeycloak_openid.logout(token['refresh_token'])# Get Certscerts=keycloak_openid.certs()# Get RPT (Entitlement)token=keycloak_openid.token("user","password")rpt=keycloak_openid.entitlement(token['access_token'],"resource_id")# Instropect RPTtoken_rpt_info=keycloak_openid.introspect(keycloak_openid.introspect(token['access_token'],rpt=rpt['rpt'],token_type_hint="requesting_party_token"))# Introspect Tokentoken_info=keycloak_openid.introspect(token['access_token'])# Decode TokenKEYCLOAK_PUBLIC_KEY="-----BEGIN PUBLIC KEY-----\n"+keycloak_openid.public_key()+"\n-----END PUBLIC 
KEY-----"options={"verify_signature":True,"verify_aud":True,"verify_exp":True}token_info=keycloak_openid.decode_token(token['access_token'],key=KEYCLOAK_PUBLIC_KEY,options=options)# Get permissions by tokentoken=keycloak_openid.token("user","password")keycloak_openid.load_authorization_config("example-authz-config.json")policies=keycloak_openid.get_policies(token['access_token'],method_token_info='decode',key=KEYCLOAK_PUBLIC_KEY)permissions=keycloak_openid.get_permissions(token['access_token'],method_token_info='introspect')# KEYCLOAK ADMINfromkeycloakimportKeycloakAdminkeycloak_admin=KeycloakAdmin(server_url="http://localhost:8080/auth/",username='example-admin',password='secret',realm_name="master",user_realm_name="only_if_other_realm_than_master",client_secret_key="client-secret",verify=True)# Add usernew_user=keycloak_admin.create_user({"email":"[email protected]","username":"[email protected]","enabled":True,"firstName":"Example","lastName":"Example"})# Add user and raise exception if username already exists# exist_ok currently defaults to True for backwards compatibility reasonsnew_user=keycloak_admin.create_user({"email":"[email protected]","username":"[email protected]","enabled":True,"firstName":"Example","lastName":"Example"},exist_ok=False)# Add user and set passwordnew_user=keycloak_admin.create_user({"email":"[email protected]","username":"[email protected]","enabled":True,"firstName":"Example","lastName":"Example","credentials":[{"value":"secret","type":"password",}]})# Add user and specify a localenew_user=keycloak_admin.create_user({"email":"[email protected]","username":"[email protected]","enabled":True,"firstName":"Example","lastName":"Example","attributes":{"locale":["fr"]})# User countercount_users=keycloak_admin.users_count()# Get users Returns a list of users, filtered according to query parametersusers=keycloak_admin.get_users({})# Get user ID from nameuser_id_keycloak=keycloak_admin.get_user_id("[email protected]")# Get 
Useruser=keycloak_admin.get_user("user-id-keycloak")# Update Userresponse=keycloak_admin.update_user(user_id="user-id-keycloak",payload={'firstName':'Example Update'})# Update User Passwordresponse=keycloak_admin.set_user_password(user_id="user-id-keycloak",password="secret",temporary=True)# Get User Credentialscredentials=keycloak_admin.get_credentials(user_id='user_id')# Get User Credential by IDcredential=keycloak_admin.get_credential(user_id='user_id',credential_id='credential_id')# Delete User Credentialresponse=keycloak_admin.delete_credential(user_id='user_id',credential_id='credential_id')# Delete Userresponse=keycloak_admin.delete_user(user_id="user-id-keycloak")# Get consents granted by the userconsents=keycloak_admin.consents_user(user_id="user-id-keycloak")# Send User Actionresponse=keycloak_admin.send_update_account(user_id="user-id-keycloak",payload=json.dumps(['UPDATE_PASSWORD']))# Send Verify Emailresponse=keycloak_admin.send_verify_email(user_id="user-id-keycloak")# Get sessions associated with the usersessions=keycloak_admin.get_sessions(user_id="user-id-keycloak")# Get themes, social providers, auth providers, and event listeners available on this serverserver_info=keycloak_admin.get_server_info()# Get clients belonging to the realm Returns a list of clients belonging to the realmclients=keycloak_admin.get_clients()# Get client - id (not client-id) from client by nameclient_id=keycloak_admin.get_client_id("my-client")# Get representation of the client - id of client (not client-id)client=keycloak_admin.get_client(client_id="client_id")# Get all roles for the realm or clientrealm_roles=keycloak_admin.get_realm_roles()# Get all roles for the clientclient_roles=keycloak_admin.get_client_roles(client_id="client_id")# Get client rolerole=keycloak_admin.get_client_role(client_id="client_id",role_name="role_name")# Warning: Deprecated# Get client role id from namerole_id=keycloak_admin.get_client_role_id(client_id="client_id",role_name="test")# Create 
client rolekeycloak_admin.create_client_role(client_role_id='client_id',{'name':'roleName','clientRole':True})# Assign client role to user. Note that BOTH role_name and role_id appear to be required.keycloak_admin.assign_client_role(client_id="client_id",user_id="user_id",role_id="role_id",role_name="test")# Retrieve client roles of a user.keycloak_admin.get_client_roles_of_user(user_id="user_id",client_id="client_id")# Retrieve available client roles of a user.keycloak_admin.get_available_client_roles_of_user(user_id="user_id",client_id="client_id")# Retrieve composite client roles of a user.keycloak_admin.get_composite_client_roles_of_user(user_id="user_id",client_id="client_id")# Delete client roles of a user.keycloak_admin.delete_client_roles_of_user(client_id="client_id",user_id="user_id",roles={"id":"role-id"})keycloak_admin.delete_client_roles_of_user(client_id="client_id",user_id="user_id",roles=[{"id":"role-id_1"},{"id":"role-id_2"}])# Get all client authorization resourcesclient_resources=get_client_authz_resources(client_id="client_id")# Get all client authorization scopesclient_scopes=get_client_authz_scopes(client_id="client_id")# Get all client authorization permissionsclient_permissions=get_client_authz_permissions(client_id="client_id")# Get all client authorization policiesclient_policies=get_client_authz_policies(client_id="client_id")# Create new groupgroup=keycloak_admin.create_group({"name":"Example Group"})# Get all groupsgroups=keycloak_admin.get_groups()# Get groupgroup=keycloak_admin.get_group(group_id='group_id')# Get group by namegroup=keycloak_admin.get_group_by_path(path='/group/subgroup',search_in_subgroups=True)# Function to trigger user sync from providersync_users(storage_id="storage_di",action="action")# Get client role id from namerole_id=keycloak_admin.get_client_role_id(client_id=client_id,role_name="test")# Get all roles for the realm or clientrealm_roles=keycloak_admin.get_roles()# Assign client role to user. 
Note that BOTH role_name and role_id appear to be required.keycloak_admin.assign_client_role(client_id=client_id,user_id=user_id,role_id=role_id,role_name="test")# Assign realm roles to userkeycloak_admin.assign_realm_roles(user_id=user_id,roles=realm_roles)# Get all ID Providersidps=keycloak_admin.get_idps()# Create a new Realmkeycloak_admin.create_realm(payload={"realm":"demo"},skip_exists=False)
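decode_token above verifies the token signature against the realm public key. Purely to illustrate what a JWT carries, here is a standard-library-only peek at the (unverified) payload segment — the toy token below is hand-made for the demo, and this must never replace decode_token's signature check:

```python
import base64
import json

def peek_jwt_payload(token):
    """Decode the (unverified!) payload segment of a JWT.
    For real use, rely on keycloak_openid.decode_token, which
    checks the signature against the realm public key."""
    payload_b64 = token.split(".")[1]
    # JWT segments are base64url without padding; restore it
    payload_b64 += "=" * (-len(payload_b64) % 4)
    return json.loads(base64.urlsafe_b64decode(payload_b64))

# A toy token built from a hand-made header/payload (no real signature)
header = base64.urlsafe_b64encode(json.dumps({"alg": "none"}).encode()).rstrip(b"=").decode()
payload = base64.urlsafe_b64encode(json.dumps({"preferred_username": "example-user"}).encode()).rstrip(b"=").decode()
toy_token = f"{header}.{payload}."

print(peek_jwt_payload(toy_token))  # {'preferred_username': 'example-user'}
```

This is handy for debugging what claims Keycloak put into a token, but any authorization decision must go through the verified decode_token or introspect paths shown above.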
adam-authsession
No description available on PyPI.
adam-core
Asteroid Institute ADAM CoreA set of shared astrodynamics libraries and utilities.adam_coreis used by a variety of library and services at Asteroid Institute. Sharing these common classes, types, and conversions amongst our tools ensures consistency and accuracy.UsageOrbitsTo define an orbit:fromastropy.timeimportTimefromadam_core.coordinatesimportKeplerianCoordinatesfromadam_core.coordinatesimportTimesfromadam_core.coordinatesimportOriginfromadam_core.orbitsimportOrbitskeplerian_elements=KeplerianCoordinates.from_kwargs(times=Times.from_astropy(Time([59000.0],scale="tdb",format="mjd")),a=[1.0],e=[0.002],i=[10.],raan=[50.0],ap=[20.0],M=[30.0],origin=Origin.from_kwargs(code=["SUN"]),frame="ecliptic")orbits=Orbits.from_kwargs(orbit_ids=["1"],object_ids=["Test Orbit"],coordinates=keplerian_elements.to_cartesian(),)Note that internally, all orbits are stored in Cartesian coordinates. Cartesian coordinates do not have any singularities and are thus more robust for numerical integration. Any orbital element conversions to Cartesian can be done on demand by callingto_cartesian()on the coordinates object.The underlying orbits class is 2 dimensional and can store elements and covariances for multiple orbits.fromastropy.timeimportTimefromadam_core.coordinatesimportKeplerianCoordinatesfromadam_core.coordinatesimportTimesfromadam_core.coordinatesimportOriginfromadam_core.orbitsimportOrbitskeplerian_elements=KeplerianCoordinates.from_kwargs(times=Times.from_astropy(Time([59000.0,60000.0],scale="tdb",format="mjd")),a=[1.0,3.0],e=[0.002,0.0],i=[10.,30.],raan=[50.0,32.0],ap=[20.0,94.0],M=[30.0,159.0],origin=Origin.from_kwargs(code=["SUN","SUN"]),frame="ecliptic")orbits=Orbits.from_kwargs(orbit_ids=["1","2"],object_ids=["Test Orbit 1","Test Orbit 2"],coordinates=keplerian_elements.to_cartesian(),)Orbits can be easily converted to a pandas 
DataFrame:orbits.to_dataframe()orbit_idsobject_idstimes.jd1times.jd2xyzvxvyvz...cov_vy_ycov_vy_zcov_vy_vxcov_vy_vycov_vz_xcov_vz_ycov_vz_zcov_vz_vxcov_vz_vycov_vz_vz01TestOrbit12459000.00.5-0.1664030.9752730.133015-0.016838-0.0031170.001921...NaNNaNNaNNaNNaNNaNNaNNaNNaNNaN12TestOrbit22460000.00.50.572777-2.571820-1.4344570.0093870.002900-0.001452...NaNNaNNaNNaNNaNNaNNaNNaNNaNNaNOrbits can also be defined with uncertainties.importnumpyasnpfromastropy.timeimportTimefromadam_core.coordinatesimportKeplerianCoordinatesfromadam_core.coordinatesimportTimesfromadam_core.coordinatesimportOriginfromadam_core.coordinatesimportCoordinateCovariancesfromadam_core.orbitsimportOrbitskeplerian_elements=KeplerianCoordinates.from_kwargs(times=Times.from_astropy(Time([59000.0],scale="tdb",format="mjd")),a=[1.0],e=[0.002],i=[10.],raan=[50.0],ap=[20.0],M=[30.0],covariances=CoordinateCovariances.from_sigmas(np.array([[0.002,0.001,0.01,0.01,0.1,0.1]])),origin=Origin.from_kwargs(code=["SUN"]),frame="ecliptic")orbits=Orbits.from_kwargs(orbit_ids=["1"],object_ids=["Test Orbit with Uncertainties"],coordinates=keplerian_elements.to_cartesian(),)orbits.to_dataframe(sigmas=True)orbit_idsobject_idstimes.jd1times.jd2xyzvxvyvz...cov_vy_ycov_vy_zcov_vy_vxcov_vy_vycov_vz_xcov_vz_ycov_vz_zcov_vz_vxcov_vz_vycov_vz_vz01TestOrbitwithUncertainties2459000.00.5-0.1664030.9752730.133015-0.016838-0.0031170.001921...3.625729e-08-1.059731e-08-9.691716e-111.872922e-091.392222e-08-1.759744e-09-1.821839e-09-7.865582e-112.237521e-103.971297e-11To query orbits from JPL Horizons:fromastropy.timeimportTimefromadam_core.orbits.queryimportquery_horizonstimes=Time([60000.0],scale="tdb",format="mjd")object_ids=["Duende","Eros","Ceres"]orbits=query_horizons(object_ids,times)To query orbits from JPL SBDB:fromadam_core.orbits.queryimportquery_sbdbobject_ids=["Duende","Eros","Ceres"]orbits=query_sbdb(object_ids)Orbital Element ConversionsOrbital elements can be accessed via the corresponding attribute. 
All conversions, including covariances, are done on demand and stored.# Cartesian Elementsorbits.coordinates# To convert to other representationscometary_elements=orbits.coordinates.to_cometary()keplerian_elements=orbits.coordinates.to_keplerian()spherical_elements=orbits.coordinates.to_spherical()PropagatorThe propagator class inadam_coreprovides a generalized interface to the supported orbit integrators and ephemeris generators. By default,adam_coreships with PYOORB.To propagate orbits with PYOORB (here we grab some orbits from Horizons first):importnumpyasnpfromastropy.timeimportTimefromastropyimportunitsasufromadam_core.orbits.queryimportquery_horizonsfromadam_core.propagatorimportPYOORB# Get orbit to propagateinitial_time=Time([60000.0],scale="tdb",format="mjd")object_ids=["Duende","Eros","Ceres"]orbits=query_horizons(object_ids,initial_time)# Make sure PYOORB is readypropagator=PYOORB()# Define propagation timestimes=initial_time+np.arange(0,100)*u.d# Propagate orbits! This function supports multiprocessing for large# propagation jobs.propagated_orbits=propagator.propagate_orbits(orbits,times,chunk_size=100,num_jobs=1,)Low-level APIsGetting the heliocentric ecliptic state vector of a DE440 body at a given set of times (in this case the barycenter of the Jovian system):importnumpyasnpfromastropy.timeimportTimefromadam_core.coordinates.originimportOriginCodesfromadam_core.utils.spiceimportget_perturber_statestates=get_perturber_state(OriginCodes.JUPITER_BARYCENTER,Time(np.arange(59000,60000),format="mjd",scale="tdb"),frame="ecliptic",origin=OriginCodes.SUN,)Package Structureadam_core ├──constants.py# Shared constants├──coordinates# Coordinate classes and transformations├──dynamics# Numerical solutions├──orbits# Orbits class and query utilities└──utils# Utility classes like Indexable or conversions like times_from_dfInstallationADAM Core is available on PyPIpipinstalladam_coreDevelopmentDevelopment is made easy with our Docker container environment.# Build the 
containerdockercomposebuild# Run tests in the containerdockercomposerunadam_corepytest.# Run a shell in the containerdockercomposerunadam_corebash
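The Keplerian-to-Cartesian conversions above ultimately rest on solving Kepler's equation M = E - e*sin(E) for the eccentric anomaly E. This is not adam_core's code; just a minimal stdlib sketch of that step (the function name and the Newton iteration are illustrative):

```python
import math

def solve_kepler(M, e, tol=1e-12):
    """Solve Kepler's equation M = E - e*sin(E) for the eccentric anomaly E.

    Newton's method; M in radians, 0 <= e < 1. Illustrative helper only,
    not part of the adam_core API.
    """
    E = M if e < 0.8 else math.pi  # standard starting guess
    for _ in range(50):
        dE = (E - e * math.sin(E) - M) / (1.0 - e * math.cos(E))
        E -= dE
        if abs(dE) < tol:
            break
    return E

# eccentric anomaly for the Keplerian example above (e = 0.002, M = 30 deg)
E = solve_kepler(math.radians(30.0), 0.002)
```

For near-circular orbits like this one, E stays close to M; the iteration matters for larger eccentricities.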
adam-credmanager
Class for accessing objects in Secret Manager.
adam-daphne
Daphne is a HTTP, HTTP2 and WebSocket protocol server forASGIandASGI-HTTP, developed to power Django Channels.It supports automatic negotiation of protocols; there’s no need for URL prefixing to determine WebSocket endpoints versus HTTP endpoints.RunningSimply point Daphne to your ASGI application, and optionally set a bind address and port (defaults to localhost, port 8000):daphne -b 0.0.0.0 -p 8001 django_project.asgi:applicationIf you intend to run daphne behind a proxy server you can use UNIX sockets to communicate between the two:daphne -u /tmp/daphne.sock django_project.asgi:applicationIf daphne is being run inside a process manager, you might want it to bind to a file descriptor passed down from a parent process. To achieve this you can use the –fd flag:daphne --fd 5 django_project.asgi:applicationIf you want more control over the port/socket bindings you can fall back to usingtwisted’s endpoint description stringsby using the–endpoint (-e)flag, which can be used multiple times. This line would start a SSL server on port 443, assuming thatkey.pemandcrt.pemexist in the current directory (requires pyopenssl to be installed):daphne -e ssl:443:privateKey=key.pem:certKey=crt.pem django_project.asgi:applicationEndpoints even let you use thetxacmeendpoint syntax to get automatic certificates from Let’s Encrypt, which you can read more about athttp://txacme.readthedocs.io/en/stable/.To see all available command line options run daphne with the-hflag.HTTP/2 SupportDaphne supports terminating HTTP/2 connections natively. You’ll need to do a couple of things to get it working, though. 
First, you need to make sure you install the Twistedhttp2andtlsextras:pip install -U 'Twisted[tls,http2]'Next, because all current browsers only support HTTP/2 when using TLS, you will need to start Daphne with TLS turned on, which can be done using the Twisted endpoint syntax:daphne -e ssl:443:privateKey=key.pem:certKey=crt.pem django_project.asgi:applicationAlternatively, you can use thetxacmeendpoint syntax or anything else that enables TLS under the hood.You will also need to be on a system that hasOpenSSL 1.0.2 or greater; if you are using Ubuntu, this means you need at least Ubuntu 16.04.Now, when you start up Daphne, it should tell you this in the log:2017-03-18 19:14:02,741 INFO Starting server at ssl:port=8000:privateKey=privkey.pem:certKey=cert.pem, channel layer django_project.asgi:channel_layer. 2017-03-18 19:14:02,742 INFO HTTP/2 support enabledThen, connect with a browser that supports HTTP/2, and everything should be working. It’s often hard to tell that HTTP/2 is working, as the log Daphne gives you will be identical (it’s HTTP, after all), and most browsers don’t make it obvious in their network inspector windows. There are browser extensions that will let you know clearly if it’s working or not.Daphne only supports “normal” requests over HTTP/2 at this time; there is not yet support for extended features like Server Push. It will, however, result in much faster connections and lower overheads.If you have a reverse proxy in front of your site to serve static files or similar, HTTP/2 will only work if that proxy understands and passes through the connection correctly.Root Path (SCRIPT_NAME)In order to set the root path for Daphne, which is the equivalent of the WSGISCRIPT_NAMEsetting, you have two options:Pass a header valueDaphne-Root-Path, with the desired root path as a URLencoded ASCII value. 
This header will not be passed down to applications.Set the--root-pathcommandline option with the desired root path as a URLencoded ASCII value.The header takes precedence if both are set. As withSCRIPT_ALIAS, the value should start with a slash, but not end with one; for example:daphne --root-path=/forum django_project.asgi:applicationPython SupportDaphne requires Python 3.6 or later.ContributingPlease refer to themain Channels contributing docs.To run tests, make sure you have installed thetestsextra with the package:cd daphne/ pip install -e '.[tests]' pytestMaintenance and SecurityTo report security issues, please [email protected]. For GPG signatures and more security process information, seehttps://docs.djangoproject.com/en/dev/internals/security/.To report bugs or request new features, please open a new GitHub issue.This repository is part of the Channels project. For the shepherd and maintenance team, please see themain Channels readme.
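Daphne serves any ASGI callable, not only Django's. As a sketch of what a target like `django_project.asgi:application` resolves to, here is a minimal raw ASGI application (the module name and response text are made up) that could be served with `daphne hello_asgi:application`:

```python
# hello_asgi.py -- a minimal ASGI "hello world" application
async def application(scope, receive, send):
    # Daphne invokes this once per connection; an HTTP scope carries
    # request metadata (path, method, headers, ...).
    assert scope["type"] == "http"
    await send({
        "type": "http.response.start",
        "status": 200,
        "headers": [(b"content-type", b"text/plain")],
    })
    await send({
        "type": "http.response.body",
        "body": b"Hello from Daphne!",
    })
```

The two `send` messages (`http.response.start`, then `http.response.body`) are the ASGI-HTTP response protocol that Daphne translates into an HTTP/1.1 or HTTP/2 response on the wire.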
adamdobson2048
No description available on PyPI.
adamet
Adaptive Metropolis for Bayesian AnalysisAdaMet is a well-tested Python implementation byCappellari et al. (2013)of the Adaptive Metropolis algorithm byHaario H., Saksman E., Tamminen J. (2001). It was used in a number of published papers in the astrophysics literature.ContentsAttributionInstallationDocumentationAdaMet PurposeUsage ExampleCalling SequenceInput ParametersOptional KeywordsOutput ParametersLicenseAttributionIf you use this software for your research, please cite at leastCappellari et al. (2013)where the implementation was introduced. The BibTeX entry for the paper is:@ARTICLE{Cappellari2013a, author = {{Cappellari}, M. and {Scott}, N. and {Alatalo}, K. and {Blitz}, L. and {Bois}, M. and {Bournaud}, F. and {Bureau}, M. and {Crocker}, A.~F. and {Davies}, R.~L. and {Davis}, T.~A. and {de Zeeuw}, P.~T. and {Duc}, P.-A. and {Emsellem}, E. and {Khochfar}, S. and {Krajnovi{\'c}}, D. and {Kuntschner}, H. and {McDermid}, R.~M. and {Morganti}, R. and {Naab}, T. and {Oosterloo}, T. and {Sarzi}, M. and {Serra}, P. and {Weijmans}, A.-M. and {Young}, L.~M.}, title = "{The ATLAS$^{3D}$ project - XV. Benchmark for early-type galaxies scaling relations from 260 dynamical models: mass-to-light ratio, dark matter, Fundamental Plane and Mass Plane}", journal = {MNRAS}, eprint = {1208.3522}, year = 2013, volume = 432, pages = {1709-1741}, doi = {10.1093/mnras/stt562} }Installationinstall with:pip install adametWithout writing access to the globalsite-packagesdirectory, use:pip install --user adametDocumentationThe documentation is in the docstring of the fileadamet.pyor onPyPi.AdaMet PurposeThis is the implementation byCappellari et al. (2013)of the Adaptive Metropolis algorithm byHaario H., Saksman E., Tamminen J. 
(2001)for Bayesian analysis.Usage ExampleTo learn how to useAdaMetrun the example program in theadamet/examplesdirectory, within the main package installation folder insidesite-packages, and read the detailed documentation in the docstring of the fileadamet.pyor onPyPi.Note: For dimensions = 1 to 6, the optimal acceptance rates arerate = [0.441, 0.352, 0.316, 0.279, 0.275, 0.266]and the asymptotic value for many parameters is 23%Calling Sequencepars,lnprob=adamet(lnprob_fun,pars0,sigpars0,bounds,nstep,labels=None,nprint=100,quiet=False,fignum=None,plot=True,labels_scaling=1,seed=None,args=(),kwargs={})Input Parameterslnprob_fun: callableThis function returns the natural logarithm of the conditional probability of the model, given the data:P(model | data) ~ P(data | model) P(model)pars0: array_like with shape (n,)vector with the mean of the multivariate Gaussian describing the proposal distribution from which samples are drawn. For maximum efficiency, this initial Gaussian should approximate the posterior distribution. This suggests adopting aspars0an initial guess for the model best-fitting parameters.sigpars0: array_like with shape (n,)vector with the dispersionsigmaof the multivariate Gaussian describing the proposal distribution. For maximum efficiency, this initial Gaussian should approximate the posterior distribution. This suggests adopting assigparsan initial guess of the uncertainty in the model parameterspars.bounds: 2-tuple of array_likeLower and upper bounds on independent variables. Each array must match the size ofpars. The model probability is set to zero outside the bounds. This keyword is also used to define the plotting ranges.nsteps: integerNumber of attempted moves in the chain. Typical numbers are a few thousandsnsteps.Optional Keywordslabels: array_like with shape (n,)String labels for each parameter to be used in thecorner_plotnprint: integerSpecifies the frequency for the intermediate plots, in moves. 
A typical value could be nstep/10.

plot: boolean, optional
Specifies whether to show a plot of the results or not.

fignum: integer, optional
Specifies the figure number for the plot.

labels_scaling: float
Relative scaling for the plotting labels.

seed: integer
Seed for the random generator. Specify this value for a repeatable random sequence.

args, kwargs: tuple and dict, optional
Additional arguments passed to lnprob_fun. Both empty by default. The calling signature is lnprob_fun(x, *args, **kwargs).

Output Parameters

pars: array_like with shape (nsteps, n)
Posterior distribution for the model parameters

lnprob: array_like with shape (nsteps, n)
Logarithm of the probability of the model, given the data, for each set of parameters in the posterior distribution pars.

License

Other/Proprietary License

Copyright (c) 2012-2020 Michele Cappellari

This software is provided as is without any warranty whatsoever. Permission to use, for non-commercial purposes is granted. Permission to modify for personal or internal use is granted, provided this copyright and disclaimer are included in all copies of the software. All other rights are reserved. In particular, redistribution of the code is not allowed.
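To make the lnprob_fun calling convention concrete, here is a deliberately simplified, pure-Python random-walk Metropolis sampler. Unlike AdaMet it does not adapt the proposal covariance, and all names and numbers are illustrative, not taken from the package:

```python
import math
import random

def lnprob_fun(x):
    # toy "posterior": log-probability of a unit Gaussian centred on 2.0
    return -0.5 * (x - 2.0) ** 2

def metropolis(lnprob, x0, sigma, nstep, seed=123):
    """Plain random-walk Metropolis; AdaMet additionally adapts sigma on the fly."""
    rng = random.Random(seed)
    chain, lnp = [x0], lnprob(x0)
    for _ in range(nstep - 1):
        x_new = chain[-1] + rng.gauss(0.0, sigma)
        lnp_new = lnprob(x_new)
        # accept with probability min(1, exp(lnp_new - lnp))
        if math.log(rng.random()) < lnp_new - lnp:
            chain.append(x_new)
            lnp = lnp_new
        else:
            chain.append(chain[-1])
    return chain

chain = metropolis(lnprob_fun, x0=0.0, sigma=1.0, nstep=5000)
mean = sum(chain[1000:]) / len(chain[1000:])  # discard burn-in
```

The chain's post-burn-in mean approaches the toy posterior's mean; the acceptance-rate figures quoted above are what AdaMet's adaptation targets automatically.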
adam-fdc-newypei
Example PackageThis is a simple example package. You can useGithub-flavored Markdownto write your content.test test test test test
adamgpppythoncodestorage
No description available on PyPI.
adamic
adamic Package

Overview

This package contains the code to build a data dictionary when passed a Pandas dataframe.

Installation

This package is available on pip. To install, run the following from your preferred shell:

pip install adamic

Use

After installing the package to your environment, import the package in your script, Jupyter notebook, or directly at the python3 command line.

from adamic import adamic

To create your data dictionary, pass a Pandas dataframe to the create_data_dictionary() function:

adamic.create_data_dictionary(sample_df)

The package will prompt you to supply definitions for each variable in the dataset. Hit Enter after supplying a definition, or hit Enter on its own if you want to define the variable later, after the output file has been created. Finally, you will be prompted to choose your preferred file extension; .csv, .json, and .xlsx are the available options.

Credit

I've borrowed ideas for this package (especially the method of adding definitions) from this Medium article.
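Under the hood, a data dictionary is just a table of variable name, type, and human-supplied definition. A stdlib-only sketch of that output follows; this is not adamic's code, it skips the interactive prompts, and all column names and definitions are invented:

```python
import csv
import io

# toy "dataframe": column name -> values (stands in for a pandas DataFrame)
data = {"age": [34, 29, 41], "city": ["Oslo", "Lima", "Pune"]}
# definitions that adamic would collect interactively, supplied up front here
definitions = {"age": "Age of respondent in years", "city": "City of residence"}

def build_data_dictionary(data, definitions):
    """One data-dictionary row per column: name, inferred type, definition."""
    rows = []
    for col, values in data.items():
        rows.append({
            "variable": col,
            "type": type(values[0]).__name__,
            "definition": definitions.get(col, ""),
        })
    return rows

rows = build_data_dictionary(data, definitions)

# write the .csv flavour of the output
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["variable", "type", "definition"])
writer.writeheader()
writer.writerows(rows)
```

The .json and .xlsx outputs mentioned above would serialize the same rows through different writers.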
adam-io
ADAM 6050-D REST API ImplementationADAM 6000-Series Manual6050-D has 12 digital inputs and 6 digital outputs. It can be controlled with http requests. The aim of this repo is to wrap the ADAM API into a convenience module for Event Gates' deep learning platform VIS.Installationpip install adam-ioUsage ExamplesYou can update output state in three steps;Create the ADAM objectCreate/reuse DigitalOutputCall ADAM’s output method with DigitalOutputRead the input state;Call ADAM’s input methodCreate adam objectIP, username, password of ADAM should be already set from APEXfromadamimportADAM6050DasADAMip='192.168.1.1'username='user'password='pass'adam=ADAM(ip,username,password)Construct the digital output objectTo change the state of the outputs, you should create/reuse a DigitalOutput object After creating the object, the initial state is empty, so making a request straight away changes nothing.fromdigital_ioimportDigitalOutputdo=DigitalOutput()# set every available output to 1do[0]=1do[1]=1do[2]=1do[3]=1do[4]=1do[5]=1You don’t have to set every bit, you can just change the ones you need.fromdigital_ioimportDigitalOutputdo=DigitalOutput()# set DO0 to 1 and DO5 to 0do[0]=1do[5]=0DigitalOutput accepts an array to set the outputs all at oncefromdigital_ioimportDigitalOutput# set every available output to 1initial_array=[1,1,1,1,1,1,1]do=DigitalOutput(array=initial_array)Change the stateAfter creating adam and setting the digital outputs, make the request by calling the output method of ADAM and pass the digitalOuput object as argument.fromadamimportADAM6050DasADAMfromdigital_ioimportDigitalOutputip='192.168.1.1'username='user'password='pass'adam=ADAM(ip,username,password)do=DigitalOutput()# set DO0 to 1 and DO5 to 0do[0]=1do[5]=0# request the state changetry:adam.output(do)exceptExceptionaserr:print(err)Read the state of outputYou can get the current state by calling the digitalOutput object without an argumentcurrent_output=adam.output()# state of DO0current_output[0]Read 
the state of inputTo read the input state, call input() on ADAM. You can pass in the id of a specific input if you want. Otherwise every input value is retrievedinput_id=0di_0=adam.input(input_id)# value of DI0print(di_0)di=adam.input(input_id)# digital inputsprint(di[0])# DI0print(di[1])# DI1###print(di[10])# DI10print(di[11])# DI11LICENSEMIT LicenseCopyright (c) 2020 Event GatesPermission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the “Software”), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.THE SOFTWARE IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
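The DigitalOutput semantics shown above (only explicitly set channels are included in the next request; the rest stay unchanged) can be illustrated with a small stand-in class. This mock is not the adam-io implementation, and the class name is invented:

```python
class DigitalOutputMock:
    """Illustrative stand-in for adam-io's DigitalOutput: the ADAM 6050-D has
    outputs DO0..DO5, and only explicitly set bits are sent on the next request."""

    CHANNELS = 6

    def __init__(self, array=None):
        self._bits = {}
        if array is not None:
            for i, v in enumerate(array[: self.CHANNELS]):
                self[i] = v

    def __setitem__(self, channel, value):
        if not 0 <= channel < self.CHANNELS:
            raise IndexError("ADAM 6050-D has outputs DO0..DO5")
        self._bits[channel] = 1 if value else 0

    def __getitem__(self, channel):
        # None means "leave this output unchanged"
        return self._bits.get(channel)

do = DigitalOutputMock()
do[0] = 1
do[5] = 0
```

A state-change request would then transmit only channels 0 and 5, mirroring the partial-update behaviour described above.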
adamix-gpt2
Adapting GPT-2 using LoRA, Adapters and Adamix

This folder contains the implementation of LoRA, Adamix Adapter and Adamix LoRA in GPT-2 using the modified Python package lora, and steps to replicate the results in our recent paper. This repo reproduces our experiments on GPT-2 Medium.

Repository Overview

Our implementation is based on the fine-tuning code for GPT-2 in Hugging Face. There are several directories in this repo:
src/ contains the source code used for data processing, training, and decoding.
eval/ contains the code for task-specific evaluation scripts.
data/ contains the raw data we used in our experiments.
vocab/ contains the GPT-2 vocabulary files.

Getting Started

You can start with the following docker image: nvcr.io/nvidia/pytorch:20.03-py3 on a GPU-capable machine, but any generic PyTorch image should work.

docker run -it nvcr.io/nvidia/pytorch:20.03-py3

Clone the repo and install dependencies using the provided setup script in a virtual environment (remove sudo wherever necessary if running in a docker container):

bash setup.sh

Now we are ready to replicate the results in our paper.

Replicating Our Results on GPT-2 Medium

(see our paper for hyperparameters for GPT-2 Medium)

Train GPT-2 Medium on the E2E NLG Challenge dataset

LoRA

python -m torch.distributed.launch --nproc_per_node=1 src/gpt2_ft.py \
    --train_data ./data/e2e/train.jsonl \
    --valid_data ./data/e2e/valid.jsonl \
    --train_batch_size 8 \
    --grad_acc 1 \
    --valid_batch_size 4 \
    --seq_len 512 \
    --model_card gpt2.md \
    --init_checkpoint ./pretrained_checkpoints/gpt2-medium-pytorch_model.bin \
    --platform local \
    --clip 0.0 \
    --lr 0.0002 \
    --weight_decay 0.01 \
    --correct_bias \
    --adam_beta2 0.999 \
    --scheduler linear \
    --warmup_step 500 \
    --max_epoch 5 \
    --save_interval 1000 \
    --lora_dim 4 \
    --lora_alpha 32 \
    --lora_dropout 0.1 \
    --label_smooth 0.1 \
    --work_dir ./trained_models/GPT2_M/e2e/lora_only \
    --random_seed 110 \
    --lora_only 1

Adapter with Adamix

python -m torch.distributed.launch --nproc_per_node=1 src/gpt2_ft.py \
--train_data ./data/e2e/train.jsonl \ --valid_data ./data/e2e/valid.jsonl \ --train_batch_size 8 \ --grad_acc 1 \ --valid_batch_size 4 \ --seq_len 512 \ --model_card gpt2.md \ --init_checkpoint ./pretrained_checkpoints/gpt2-medium-pytorch_model.bin \ --platform local \ --clip 0.0 \ --lr 0.0002 \ --weight_decay 0.01 \ --correct_bias \ --adam_beta2 0.999 \ --scheduler linear \ --warmup_step 2000 \ --max_epoch 20 \ --eval_interval 5000 \ --save_interval 5000 \ --lora_dim 4 \ --lora_alpha 32 \ --lora_dropout 0.1 \ --label_smooth 0.1 \ --work_dir ./trained_models/GPT2_M/e2e/adapter_adamix \ --random_seed 110 \ --adamix_only 1 \ --n_experts 8 \ --share_A 0 \ --share_B 1LoRA with Adamixpython -m torch.distributed.launch --nproc_per_node=1 src/gpt2_ft.py \ --train_data ./data/e2e/train.jsonl \ --valid_data ./data/e2e/valid.jsonl \ --train_batch_size 8 \ --grad_acc 1 \ --valid_batch_size 4 \ --seq_len 512 \ --model_card gpt2.md \ --init_checkpoint ./pretrained_checkpoints/gpt2-medium-pytorch_model.bin \ --platform local \ --clip 0.0 \ --lr 0.0002 \ --weight_decay 0.01 \ --correct_bias \ --adam_beta2 0.999 \ --scheduler linear \ --warmup_step 2000 \ --max_epoch 20 \ --eval_interval 5000 \ --save_interval 5000 \ --lora_dim 4 \ --lora_alpha 32 \ --lora_dropout 0.1 \ --label_smooth 0.1 \ --work_dir ./trained_models/GPT2_M/e2e/lora_adamix \ --random_seed 110 \ --n_experts 8 \ --share_A 0 \ --share_B 1Generate outputs from the trained model using beam search (LoRA with Adamix):python -m torch.distributed.launch --nproc_per_node=1 src/gpt2_beam.py \ --data ./data/e2e/test.jsonl \ --batch_size 1 \ --seq_len 128 \ --eval_len 64 \ --model_card gpt2.md \ --init_checkpoint ./trained_models/GPT2_M/e2e/lora_adamix/model.final.pt \ --platform local \ --lora_dim 4 \ --lora_alpha 32 \ --beam 10 \ --length_penalty 0.8 \ --no_repeat_ngram_size 4 \ --repetition_penalty 1.0 \ --eos_token_id 628 \ --work_dir ./trained_models/GPT2_M/e2e/lora_adamix \ --output_file predict.jsonl \ --n_experts 8 \ 
--share_A 0 \ --share_B 1Decode outputs from step (2)python src/gpt2_decode.py \ --vocab ./vocab \ --sample_file ./trained_models/GPT2_M/e2e/lora_adamix/predict.jsonl \ --input_file ./data/e2e/test_formatted.jsonl \ --output_ref_file e2e_ref.txt \ --output_pred_file e2e_pred.txtRun evaluation on E2E test setpython eval/e2e/measure_scores.py e2e_ref.txt e2e_pred.txt -pReplicating Our Result on WebNLGFollow steps 1 and 2 from E2E pipeline by replacing references to E2E with webnlg (see our paper for hyperparameters)Decode outputs from beam search (step 2 above)python src/gpt2_decode.py \ --vocab ./vocab \ --sample_file ./trained_models/GPT2_M/webnlg/lora_adamix/predict.jsonl \ --input_file ./data/webnlg_challenge_2017/test_formatted.jsonl \ --ref_type webnlg \ --ref_num 6 \ --output_ref_file eval/GenerationEval/data/references_webnlg \ --output_pred_file eval/GenerationEval/data/hypothesis_webnlg \ --tokenize --lowerRun evaluation on WebNLG test setcd ./eval/GenerationEval/ python eval.py \ -R data/references_webnlg/reference \ -H data/hypothesis_webnlg \ -nr 6 \ -m bleu,meteor,ter cd ../..Replicating Our Result on DARTFollow steps 1 and 2 from E2E pipeline by replacing references to E2E with dart (see our paper for hyperparameters)Decode outputs from beam search (step 2 above)python src/gpt2_decode.py \ --vocab ./vocab \ --sample_file ./trained_models/GPT2_M/dart/lora_adamix/predict.jsonl \ --input_file ./data/dart/test_formatted.jsonl \ --ref_type dart \ --ref_num 6 \ --output_ref_file eval/GenerationEval/data/references_dart \ --output_pred_file eval/GenerationEval/data/hypothesis_dart \ --tokenize --lowerRun evaluation on Dart test setcd ./eval/GenerationEval/ python eval.py \ -R data/references_dart/reference \ -H data/hypothesis_dart \ -nr 6 \ -m bleu,meteor,ter cd ../..
adamlint
zulint

zulint is a lightweight linting framework designed for complex applications using a mix of third-party linters and custom rules.

Why zulint

Modern full-stack web applications generally involve code written in several programming languages, each of which has its own standard linter tools. For example, Zulip uses Python (mypy/pyflake/pycodestyle), JavaScript (eslint), CSS (stylelint), puppet (puppet-lint), shell (shellcheck), and several more. For many codebases, this results in linting being an unpleasantly slow experience, resulting in even more unpleasant secondary problems like developers merging code that doesn't pass lint, not enforcing linter rules, and debates about whether a useful linter is "worth the time".

Zulint is the linter framework we built for Zulip to create a reliable, lightning-fast linter experience to solve these problems. It has the following features:

Integrates with git to check only files in source control (not automatically generated, untracked, or .gitignore files).
Runs the linters in parallel, so you only have to wait for the slowest linter. For Zulip, this is a ~4x performance improvement over running our third-party linters in series.
Produces easy-to-read, clear terminal output, with each independent linter given its own color.
Can check just modified files, or even act as a pre-commit hook, only checking files that have changed (and only starting linters which check files that have changed).
Handles all the annoying details of flushing stdout and managing color codes.
Highly configurable.
Integrate a third-party linter with just a couple lines of code.
Every feature supports convenient include/exclude rules.
Add custom lint rules with a powerful regular expression framework. E.g.
in Zulip, we want all access to Message objects in views code to be done via our access_message_by_id functions (which do security checks to ensure that the user the request is being done on behalf of has access to the message), and that is enforced in part by custom regular expression lint rules. This system is optimized Python: Zulip has a few hundred custom linter rules of this type.
Easily add custom options to check subsets of your codebase, subsets of rules, etc.
Has a nice automated testing framework for custom lint rules, so you can make sure your rules actually work.

This codebase has been in production use in Zulip for several years, but was only generalized for use by other projects in 2019. Its API should be considered beta and may change (with notice in the release notes) if we discover a better API, and patches to further extend it for more use cases are encouraged.

Using adamlint

Once a project is set up with zulint, you'll have a top-level linter script with at least the following options:

$ ./example-lint --help
usage: example-lint [-h] [--modified] [--verbose-timing] [--skip SKIP]
                    [--only ONLY] [--list] [--list-groups] [--groups GROUPS]
                    [--verbose] [--fix]
                    [targets [targets ...]]

positional arguments:
  targets               Specify directories to check

optional arguments:
  -h, --help            show this help message and exit
  --modified, -m        Only check modified files
  --verbose-timing, -vt
                        Print verbose timing output
  --skip SKIP           Specify linters to skip, eg: --skip=mypy,gitlint
  --only ONLY           Specify linters to run, eg: --only=mypy,gitlint
  --list, -l            List all the registered linters
  --list-groups, -lg    List all the registered linter groups
  --groups GROUPS, -g GROUPS
                        Only run linter for languages in the group(s), e.g.:
                        --groups=backend,frontend
  --verbose, -v         Print verbose output where available
  --fix                 Automatically fix problems where supported

pre-commit hook mode

See https://github.com/adambirds/xkcd-password-gen/blob/master/tools/pre-commit for an example pre-commit hook.

Development

Run the following commands in a
terminal to install zulint:

git clone git@github.com:adambirds/adamlint.git
python3 -m venv venv
source venv/bin/activate
python3 setup.py install
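A custom regular-expression lint rule of the kind described above is essentially a forbidden pattern plus a human-readable message, applied line by line. A minimal stdlib sketch of that mechanism (the rule set and function names are invented, not zulint's API):

```python
import re

# each custom rule: a regex that must NOT appear, plus the message to report
RULES = [
    {"pattern": r"Message\.objects\.get\(",
     "description": "Use access_message_by_id instead of direct Message access"},
    {"pattern": r"print\(",
     "description": "Use the logging framework instead of print()"},
]

def check_lines(filename, lines, rules=RULES):
    """Return one failure string per rule violation, with file and line number."""
    failures = []
    for lineno, line in enumerate(lines, start=1):
        for rule in rules:
            if re.search(rule["pattern"], line):
                failures.append(f"{filename}:{lineno}: {rule['description']}")
    return failures

failures = check_lines("views.py", ["msg = Message.objects.get(id=3)", "x = 1"])
```

Because each rule is an independent regex, hundreds of such rules can be scanned per line cheaply, which is what makes this approach practical at Zulip's scale.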
adam-lr-decay
Adam Layer-wise LR Decay

In ELECTRA, which was published by Stanford University and Google Brain, a layer-wise LR decay technique was used with the Adam optimizer to prevent catastrophic forgetting of the pre-trained model.

This repo contains an implementation of Layer-wise LR Decay for Adam, using the new Optimizer API proposed in TensorFlow 2.11.

Usage

Installation:

$ pip install adam-lr-decay  # this method does not install tensorflow

For CPU:

$ pip install adam-lr-decay[cpu]  # this method installs tensorflow-cpu>=2.11

For GPU:

$ pip install adam-lr-decay[gpu]  # this method installs tensorflow>=2.11

from tensorflow.keras import layers, models
from adam_lr_decay import AdamLRDecay

# ... prepare training data

# model definition
model = models.Sequential([
    layers.Dense(3, input_shape=(2,), name='hidden_dense'),
    layers.Dense(1, name='output'),
])

# optimizer definition with layerwise lr decay
adam = AdamLRDecay(learning_rate=1e-3)
adam.apply_layerwise_lr_decay(var_name_dicts={
    'hidden_dense': 0.1,
    'output': 0.,
})
# this config decays the key layers by the value,
# which is (lr * (1. - decay_rate))

# compile the model
model.compile(optimizer=adam)

# ... training loop

In the official ELECTRA repo, they have defined the decay rate in the code. The adapted version is as follows:

import collections
from adam_lr_decay import AdamLRDecay

def _get_layer_lrs(layer_decay, n_layers):
    key_to_depths = collections.OrderedDict({
        '/embeddings/': 0,
        '/embeddings_project/': 0,
        'task_specific/': n_layers + 2,
    })
    for layer in range(n_layers):
        key_to_depths['encoder/layer_' + str(layer) + '/'] = layer + 1
    return {
        key: 1. - (layer_decay ** (n_layers + 2 - depth))
        for key, depth in key_to_depths.items()
    }

# ... ELECTRA model definition
adam = AdamLRDecay(learning_rate=1e-3)
adam.apply_layerwise_lr_decay(var_name_dicts=_get_layer_lrs(0.9, 8))
# ... custom training loop

The generated decay rates should look like this; 0.0 means there is no decay and 1.0 means a zero learning rate (non-trainable).

{
  "/embeddings/": 0.6513215599,
  "/embeddings_project/": 0.6513215599,
  "task_specific/": 0.0,
  "encoder/layer_0/": 0.6125795109999999,
  "encoder/layer_1/": 0.5695327899999999,
  "encoder/layer_2/": 0.5217030999999999,
  "encoder/layer_3/": 0.46855899999999995,
  "encoder/layer_4/": 0.40950999999999993,
  "encoder/layer_5/": 0.3439,
  "encoder/layer_6/": 0.2709999999999999,
  "encoder/layer_7/": 0.18999999999999995
}

Citation

@article{clark2020electra,
  title={Electra: Pre-training text encoders as discriminators rather than generators},
  author={Clark, Kevin and Luong, Minh-Thang and Le, Quoc V and Manning, Christopher D},
  journal={arXiv preprint arXiv:2003.10555},
  year={2020}
}
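Since the _get_layer_lrs helper shown above is pure Python, the quoted decay table can be reproduced without TensorFlow; re-running the same logic with layer_decay=0.9 and n_layers=8 recovers the numbers shown:

```python
import collections

def _get_layer_lrs(layer_decay, n_layers):
    # same logic as the ELECTRA-adapted helper shown above
    key_to_depths = collections.OrderedDict({
        '/embeddings/': 0,
        '/embeddings_project/': 0,
        'task_specific/': n_layers + 2,
    })
    for layer in range(n_layers):
        key_to_depths['encoder/layer_' + str(layer) + '/'] = layer + 1
    # decay rate per layer: 1 - layer_decay ** (n_layers + 2 - depth)
    return {key: 1. - (layer_decay ** (n_layers + 2 - depth))
            for key, depth in key_to_depths.items()}

rates = _get_layer_lrs(0.9, 8)
# shallow layers decay hardest; the task head ('task_specific/') is not decayed
```

Each rate feeds the update lr * (1. - decay_rate), so the embeddings train at roughly a third of the base learning rate while the task-specific head trains at the full rate.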
adam-modbus
adam-modbusstateless asyncio python interface for Advantech 6317 and 6052 modules.
adamo_calibrator
UNKNOWN
adamod
# AdaMod

An optimizer which exerts adaptive momental upper bounds on individual learning rates, preventing them from becoming undesirably larger than what the historical statistics suggest and avoiding the non-convergence issue, thus yielding better performance. Strong empirical results on many deep learning applications demonstrate the effectiveness of the proposed method, especially on complex networks such as DenseNet and Transformer.

## Installation

AdaMod requires Python 3.6.0 or later.

### Installing via pip

The preferred way to install AdaMod is via pip with a virtual environment. Just run pip install adamod in your Python environment and you are ready to go!

### Using source code

As AdaMod is a Python class with only 100+ lines, an alternative way is to directly download adamod.py and copy it into your project.

## Usage

You can use AdaMod just like any other PyTorch optimizer:

optimizer = adamod.AdaMod(model.parameters(), lr=1e-3, beta3=0.999)

As described in the paper, AdaMod smooths out unexpected large learning rates throughout the training process. The beta3 parameter is the smoothing coefficient for the actual learning rate, which controls the averaging range. In common cases, a beta3 in {0.999, 0.9999} can achieve relatively good and stable results. See the paper for more details.

## Demos

For the full list of demos, please refer to the demos directory.

## Contributors

@luoruixuan (https://github.com/luoruixuan)
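The "momental upper bound" amounts to smoothing Adam's per-step learning rates with an exponential moving average (coefficient beta3) and clipping each step at that average. A scalar pure-Python sketch of that idea, paraphrased from the paper rather than taken from the package:

```python
import math

def adamod_step_sizes(grads, lr=0.1, beta2=0.999, beta3=0.999, eps=1e-8):
    """Per-step learning rates AdaMod would apply for one scalar parameter.

    Sketch of the bounding rule only: the real optimizer multiplies the
    bounded rate by Adam's bias-corrected first moment to update the weight.
    """
    v = s = 0.0
    bounded = []
    for t, g in enumerate(grads, start=1):
        v = beta2 * v + (1 - beta2) * g * g                  # Adam second-moment EMA
        eta = lr / (math.sqrt(v / (1 - beta2 ** t)) + eps)   # Adam step size
        s = beta3 * s + (1 - beta3) * eta                    # EMA of past step sizes
        bounded.append(min(eta, s))                          # momental upper bound
    return bounded

rates = adamod_step_sizes([1.0, 0.5, -0.2, 0.05])
```

Because the EMA starts at zero, the very first steps are clipped hard, which is exactly the guard against early oversized learning rates that the description above refers to.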
adamp
Slowing Down the Weight Norm Increase in Momentum-based OptimizersOfficial PyTorch implementation of AdamP and SGDP optimizers |Paper|Project pageByeongho Heo*, Sanghyuk Chun*, Seong Joon Oh, Dongyoon Han, Sangdoo Yun, Youngjung Uh, Jung-Woo Ha.* indicates equal contributionClova AI Research, NAVER Corp.AbstractNormalization techniques, such as batch normalization (BN), have led to significant improvements in deep neural network performances. Prior studies have analyzed the benefits of the resulting scale invariance of the weights for the gradient descent (GD) optimizers: it leads to a stabilized training due to the auto-tuning of step sizes. However, we show that, combined with the momentum-based algorithms, the scale invariance tends to induce an excessive growth of the weight norms. This in turn overly suppresses the effective step sizes during training, potentially leading to sub-optimal performances in deep neural networks. We analyze this phenomenon both theoretically and empirically. We propose a simple and effective solution: at each iteration of momentum-based GD optimizers (e.g., SGD or Adam) applied on scale-invariant weights (e.g., Conv weights preceding a BN layer), we remove the radial component (i.e., parallel to the weight vector) from the update vector. Intuitively, this operation prevents the unnecessary update along the radial direction that only increases the weight norm without contributing to the loss minimization. We verify that the modified optimizers SGDP and AdamP successfully regularize the norm growth and improve the performance of a broad set of models. 
Our experiments cover tasks including image classification and retrieval, object detection, robustness benchmarks, and audio classification.

How does it work?

Please visit our project page.

Updates

Jun 19, 2020: built-in cosine similarity and fix warning (v0.3.0)
Jun 19, 2020: nesterov update (v0.2.0)
Jun 15, 2020: Initial upload (v0.1.0)

Getting Started

Installation

pip3 install adamp

Usage

Usage is exactly the same as the torch.optim library!

from adamp import AdamP

# define your params
optimizer = AdamP(params, lr=0.001, betas=(0.9, 0.999), weight_decay=1e-2)

from adamp import SGDP

# define your params
optimizer = SGDP(params, lr=0.1, weight_decay=1e-5, momentum=0.9, nesterov=True)

Arguments

SGDP and AdamP share arguments with torch.optim.SGD and torch.optim.Adam. There are two additional hyperparameters; we recommend using the default values.

delta: threshold that determines whether a set of parameters is scale invariant or not (default: 0.1)
wd_ratio: relative weight decay applied on scale-invariant parameters compared to that applied on scale-variant parameters (default: 0.1)

Both SGDP and AdamP support Nesterov momentum.

nesterov: enables Nesterov momentum (default: False)

License

This project is distributed under the MIT license.

Copyright (c) 2020-present NAVER Corp. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.How to cite@article{heo2020adamp, title={Slowing Down the Weight Norm Increase in Momentum-based Optimizers}, author={Heo, Byeongho and Chun, Sanghyuk and Oh, Seong Joon and Han, Dongyoon and Yun, Sangdoo and Uh, Youngjung and Ha, Jung-Woo}, year={2020}, journal={arXiv preprint arXiv:2006.08217}, }
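The key AdamP operation described in the abstract above, removing the radial component of the update (the part parallel to the weight vector), is a one-line vector projection. A pure-Python sketch, not the package's tensor implementation:

```python
def project_out_radial(update, weight):
    """Remove from `update` its component parallel to `weight`
    (the radial direction that only grows the weight norm)."""
    wn2 = sum(w * w for w in weight)
    coef = sum(u * w for u, w in zip(update, weight)) / wn2
    return [u - coef * w for u, w in zip(update, weight)]

u = [0.3, -0.1, 0.4]   # momentum-based update direction (illustrative numbers)
w = [1.0, 2.0, -1.0]   # scale-invariant weight vector, e.g. conv weights before BN
u_proj = project_out_radial(u, w)
# u_proj is orthogonal to w: a small step along it rotates the weight
# without (to first order) changing its norm
```

In the optimizer, this projection is applied only to parameters detected as scale invariant (via the delta threshold), which is why the norm growth is suppressed without disturbing scale-variant parameters.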
adamP_BioTools
No description available on PyPI.
adampdf
No description available on PyPI.
adampdf2
This is the homepage of our project.
adampy
Documentation for Adampy

Description

Adampy allows you to retrieve, analyze and download data hosted within the ADAM environment.

Installation Procedure

```
virtualenv -p `which python3` venv
source venv/bin/activate
python3 -m pip install --upgrade pip
pip install adampy
```

Functions

getCollections

The getCollections function returns all available collections in the selected endpoint.

```
adam.getCollections(endpoint).get_data()
```

Parameters

- endpoint (str) - The name of the endpoint to get the collections from.

Returns

List with the names of all collections.

Examples

To get the list of collections:

```python
import adampy as adam

collections = adam.getCollections('wcs-eo4sdcr.adamplatform.eu').get_data()
print(collections)
```

getImage

The getImage function returns a numpy array containing the requested image. The image can be saved using Rasterio.

```
adam.getImage(endpoint, collection, time_t, min_lat = -90, max_lat = 90, min_long = -180, max_long = 180, token = 'None', geometry = 'None', masking = False, fname = 'image.tif').get_data()
```

Parameters

- endpoint (str) - The name of the endpoint to get the collections from.
- collection (str) - The name of the collection
- time_t (str) - The time or time range in the format yyyy-mm-ddThh:mm:ss
- min_lat (int or float; optional) - Minimum latitude of the bounding box (range -90 to 90)
- max_lat (int or float; optional) - Maximum latitude of the bounding box (range -90 to 90)
- min_long (int or float; optional) - Minimum longitude of the bounding box (range -180 to 180)
- max_long (int or float; optional) - Maximum longitude of the bounding box (range -180 to 180)
- token (str; optional) - Token to access restricted collections
- geometry (shp, geojson or kml file; optional) - Geometry to mask the output image
- masking (True or False; default False; optional) - Activate the masking option
- fname (str; optional) - Name for the output file; if not stated, fname = image.tif

Returns

Numpy array with the requested image, and metadata information for the image.

Examples

Get a global image for a particular time:

```python
import adampy as adam
import matplotlib
import matplotlib.pyplot as plt

image, out_meta = adam.getImage('wcs-eo4sdcr.adamplatform.eu', 'Z_CAMS_C_ECMF_PM10_4326_04', '2019-03-26T00:00:00').get_data()
plt.subplots(figsize=(13, 13))
plt.imshow(image)
```

Get a bounding box for a particular time:

```python
import adampy as adam
import matplotlib
import matplotlib.pyplot as plt

image, out_meta = adam.getImage('wcs-eo4sdcr.adamplatform.eu', 'Z_CAMS_C_ECMF_PM10_4326_04', '2019-03-26T00:00:00', 10, 20, -10, 50).get_data()
plt.subplots(figsize=(13, 13))
plt.imshow(image)
```

Get a bounding box for a time range:

```python
import adampy as adam
import matplotlib
import matplotlib.pyplot as plt

image, out_meta = adam.getImage('wcs-eo4sdcr.adamplatform.eu', 'Z_CAMS_C_ECMF_PM10_4326_04', '2019-03-26T00:00:00,2019-03-27T23:59:59', 10, 20, -10, 50).get_data()
plt.subplots(figsize=(13, 13))
plt.imshow(image)
```

Get a masked image for a time range:

```python
import adampy as adam
import matplotlib
import matplotlib.pyplot as plt

image, out_meta = adam.getImage('wcs-eo4sdcr.adamplatform.eu', 'Z_CAMS_C_ECMF_PM10_4326_04', '2019-03-26T00:00:00,2019-03-27T23:59:59', geometry='polygon.shp', masking=True).get_data()
plt.subplots(figsize=(13, 13))
plt.imshow(image)
```

getTimeSeries

The getTimeSeries function returns two arrays containing the values and time stamps for the requested latitude and longitude location.

```
adam.getTimeSeries(endpoint, collection, time_t, lat, long, token = 'None').get_data()
```

Parameters

- endpoint (str) - The name of the endpoint to get the collections from.
- collection (str) - The name of the collection
- time_t (str) - The time or time range in the format yyyy-mm-ddThh:mm:ss
- lat (int or float) - Latitude of the location (range -90 to 90)
- long (int or float) - Longitude of the location (range -180 to 180)
- token (str; optional) - Token to access restricted collections

Returns

Two arrays containing the values and time stamps for the requested latitude and longitude location.

Examples

```python
import adampy as adam

data, times = adam.getTimeSeries('wcs-eo4sdcr.adamplatform.eu', 'ERA-Interim_temp2m_4326_05', '2014-03-26T00:00:00,2014-03-30T23:59:59', 25, 60).get_data()
```

getAnimation

The getAnimation function creates an animated gif of a dataset given a start and end date.

```
adam.getAnimation(endpoint, collection, start_date, end_date, min_lat = -90, max_lat = 90, min_long = -180, max_long = 180, token = 'None', frame_duration = 0.1, legend = False).get_data()
```

Parameters

- endpoint (str) - The name of the endpoint to get the collections from.
- collection (str) - The name of the collection
- start_date (date object) - The start date of the animation
- end_date (date object) - The end date of the animation
- min_lat (int or float; optional) - Minimum latitude of the bounding box (range -90 to 90)
- max_lat (int or float; optional) - Maximum latitude of the bounding box (range -90 to 90)
- min_long (int or float; optional) - Minimum longitude of the bounding box (range -180 to 180)
- max_long (int or float; optional) - Maximum longitude of the bounding box (range -180 to 180)
- token (str; optional) - Token to access restricted collections
- frame_duration (float or int; optional) - Frame duration in seconds
- legend (True or False; optional) - Add legend to the animation

Returns

An animated GIF of the dataset for a given start and end date.

Examples

```python
import adampy as adam
from datetime import datetime, timedelta, date

start_date = date(2014, 3, 1)
end_date = date(2014, 3, 5)

gif_fname = adam.getAnimation('wcs-eo4sdcr.adamplatform.eu', 'NEXGDDP-pr_4326_025', start_date=start_date, end_date=end_date, frame_duration=0.3, legend=False).get_data()
```
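The time_t and bounding-box arguments follow fixed conventions: timestamps (or a comma-separated start,end pair) in yyyy-mm-ddThh:mm:ss format, latitudes in [-90, 90], and longitudes in [-180, 180]. A small standalone helper — hypothetical, not part of adampy — can validate these before issuing a request:

```python
from datetime import datetime

def validate_request(time_t, min_lat=-90, max_lat=90, min_long=-180, max_long=180):
    """Check the argument conventions used in the adampy examples:
    every timestamp in a single value or a 'start,end' range must parse
    as yyyy-mm-ddTHH:MM:SS, and the bounding box must be in range."""
    for stamp in time_t.split(','):
        datetime.strptime(stamp, '%Y-%m-%dT%H:%M:%S')  # raises ValueError if malformed
    if not (-90 <= min_lat <= max_lat <= 90):
        raise ValueError('latitude bounds out of range')
    if not (-180 <= min_long <= max_long <= 180):
        raise ValueError('longitude bounds out of range')
    return True

# a time range plus bounding box, as in the getImage examples above
ok = validate_request('2019-03-26T00:00:00,2019-03-27T23:59:59', 10, 20, -10, 50)
```

Validating locally gives a clearer error than a failed WCS request against the endpoint.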
adamr
AdamR

Adam with weight Recovery optimizer

TL;DR

AdamW tends to decay parameters towards zero, which makes the model "forget" the pretrained parameters during finetuning. Instead, AdamR tries to recover parameters towards the pretrained values during finetuning.

Have a try

Just like other PyTorch optimizers,

```python
from adamr import AdamR
from xxx import SomeModel, SomeData, SomeDevice, SomeLoss

model = SomeModel()
dataloader = SomeData()
model.to(SomeDevice)

adamr = AdamR(
    model.parameters(),
    lr=1e-5,
    betas=(0.9, 0.998),  # Adam's beta parameters
    eps=1e-8,
    weight_recovery=0.1,
)
loss_fn = SomeLoss()

for x, y in dataloader:
    adamr.zero_grad()
    y_bar = model(x)
    loss = loss_fn(y_bar, y)
    loss.backward()
    adamr.step()
```

Algorithm

TODO: improve the readability

Here is a paper snippet:
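The difference between weight decay and weight recovery can be sketched for plain SGD-style updates. This is a simplified NumPy illustration of the regularization idea only, not the AdamR implementation (which applies it inside an Adam update):

```python
import numpy as np

def decay_step(w, grad, lr=0.1, wd=0.1):
    # AdamW-style decoupled weight decay: pulls weights towards zero
    return w - lr * grad - lr * wd * w

def recovery_step(w, grad, w_pretrained, lr=0.1, wr=0.1):
    # weight recovery: pulls weights back towards the pretrained values
    return w - lr * grad - lr * wr * (w - w_pretrained)

w_pre = np.array([1.0, -2.0])    # pretrained weights
w = w_pre.copy()
grad = np.zeros_like(w)          # zero gradient isolates the regularizer's effect

w_decayed = decay_step(w, grad)              # shrinks towards zero
w_recovered = recovery_step(w, grad, w_pre)  # stays at the pretrained values
```

With a zero gradient, decay shrinks the pretrained weights every step, while recovery leaves them untouched — which is exactly the "forgetting" behavior the TL;DR describes.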
adam-robotics
adam

Automatic Differentiation for rigid-body-dynamics AlgorithMs

adam implements a collection of algorithms for calculating rigid-body dynamics for floating-base robots, in mixed and body-fixed representations (see Traversaro's "A Unified View of the Equations of Motion used for Control Design of Humanoid Robots") using:

- Jax
- CasADi
- PyTorch
- NumPy

adam employs the automatic differentiation capabilities of these frameworks to compute, if needed, gradients, Jacobians, and Hessians of rigid-body dynamics quantities. This approach enables the design of optimal control and reinforcement learning strategies in robotics. adam is based on Roy Featherstone's Rigid Body Dynamics Algorithms.

⚠️ REPOSITORY UNDER DEVELOPMENT ⚠️
We cannot guarantee stable API

🐍 Dependencies

- python3

Other requisites are:

- urdf_parser_py
- jax
- casadi
- pytorch
- numpy

They will be installed in the installation step!

💾 Installation

The installation can be done either using the Python provided by apt (on Debian-based distros) or via conda (on Linux and macOS).

🐍 Installation with pip

Install python3, if not installed (in Ubuntu 20.04):

```
sudo apt install python3.8
```

Create a virtual environment, if you prefer.
For example:

```
pip install virtualenv
python3 -m venv your_virtual_env
source your_virtual_env/bin/activate
```

Inside the virtual environment, install the library from pip:

Install the Jax interface:

```
pip install adam-robotics[jax]
```

Install the CasADi interface:

```
pip install adam-robotics[casadi]
```

Install the PyTorch interface:

```
pip install adam-robotics[pytorch]
```

Install ALL interfaces:

```
pip install adam-robotics[all]
```

If you want the latest version:

```
pip install adam-robotics[selected-interface]@git+https://github.com/ami-iit/ADAM
```

or clone the repo and install:

```
git clone https://github.com/ami-iit/adam.git
cd adam
pip install .[selected-interface]
```

📦 Installation with conda

Installation from the conda-forge package:

```
mamba create -n adamenv -c conda-forge adam-robotics
```

If you want to use jax or pytorch, just install the corresponding package as well.

🔨 Installation from repo

Install the required dependencies in a conda environment:

Jax interface dependencies:

```
mamba create -n adamenv -c conda-forge jax numpy lxml prettytable matplotlib urdfdom-py
```

CasADi interface dependencies:

```
mamba create -n adamenv -c conda-forge casadi numpy lxml prettytable matplotlib urdfdom-py
```

PyTorch interface dependencies:

```
mamba create -n adamenv -c conda-forge pytorch numpy lxml prettytable matplotlib urdfdom-py
```

ALL interfaces dependencies:

```
mamba create -n adamenv -c conda-forge jax casadi pytorch numpy lxml prettytable matplotlib urdfdom-py
```

Activate the environment, clone the repo and install the library:

```
mamba activate adamenv
git clone https://github.com/ami-iit/ADAM.git
cd adam
pip install --no-deps .
```

🚀 Usage

The following are small snippets of the use of adam. More examples are arriving!
Have also a look at the tests folder.

Jax interface

```python
import adam
from adam.jax import KinDynComputations
import icub_models
import numpy as np

# if you want to use icub-models (https://github.com/robotology/icub-models) to retrieve the urdf
model_path = icub_models.get_model_file("iCubGazeboV2_5")
# The joint list
joints_name_list = [
    'torso_pitch', 'torso_roll', 'torso_yaw', 'l_shoulder_pitch',
    'l_shoulder_roll', 'l_shoulder_yaw', 'l_elbow', 'r_shoulder_pitch',
    'r_shoulder_roll', 'r_shoulder_yaw', 'r_elbow', 'l_hip_pitch',
    'l_hip_roll', 'l_hip_yaw', 'l_knee', 'l_ankle_pitch', 'l_ankle_roll',
    'r_hip_pitch', 'r_hip_roll', 'r_hip_yaw', 'r_knee', 'r_ankle_pitch',
    'r_ankle_roll'
]
# Specify the root link
root_link = 'root_link'
kinDyn = KinDynComputations(model_path, joints_name_list, root_link)
# choose the representation; if you want to use the body-fixed representation:
kinDyn.set_frame_velocity_representation(adam.Representations.BODY_FIXED_REPRESENTATION)
# or, if you want to use the mixed representation (that is the default):
kinDyn.set_frame_velocity_representation(adam.Representations.MIXED_REPRESENTATION)
w_H_b = np.eye(4)
joints = np.ones(len(joints_name_list))
M = kinDyn.mass_matrix(w_H_b, joints)
print(M)
```

CasADi interface

```python
import adam
from adam.casadi import KinDynComputations
import icub_models
import numpy as np

# if you want to use icub-models (https://github.com/robotology/icub-models) to retrieve the urdf
model_path = icub_models.get_model_file("iCubGazeboV2_5")
# The joint list
joints_name_list = [
    'torso_pitch', 'torso_roll', 'torso_yaw', 'l_shoulder_pitch',
    'l_shoulder_roll', 'l_shoulder_yaw', 'l_elbow', 'r_shoulder_pitch',
    'r_shoulder_roll', 'r_shoulder_yaw', 'r_elbow', 'l_hip_pitch',
    'l_hip_roll', 'l_hip_yaw', 'l_knee', 'l_ankle_pitch', 'l_ankle_roll',
    'r_hip_pitch', 'r_hip_roll', 'r_hip_yaw', 'r_knee', 'r_ankle_pitch',
    'r_ankle_roll'
]
# Specify the root link
root_link = 'root_link'
kinDyn = KinDynComputations(model_path, joints_name_list, root_link)
# choose the representation; if you want to use the body-fixed representation:
kinDyn.set_frame_velocity_representation(adam.Representations.BODY_FIXED_REPRESENTATION)
# or, if you want to use the mixed representation (that is the default):
kinDyn.set_frame_velocity_representation(adam.Representations.MIXED_REPRESENTATION)
w_H_b = np.eye(4)
joints = np.ones(len(joints_name_list))
M = kinDyn.mass_matrix_fun()
print(M(w_H_b, joints))
```

PyTorch interface

```python
import adam
from adam.pytorch import KinDynComputations
import icub_models
import numpy as np

# if you want to use icub-models (https://github.com/robotology/icub-models) to retrieve the urdf
model_path = icub_models.get_model_file("iCubGazeboV2_5")
# The joint list
joints_name_list = [
    'torso_pitch', 'torso_roll', 'torso_yaw', 'l_shoulder_pitch',
    'l_shoulder_roll', 'l_shoulder_yaw', 'l_elbow', 'r_shoulder_pitch',
    'r_shoulder_roll', 'r_shoulder_yaw', 'r_elbow', 'l_hip_pitch',
    'l_hip_roll', 'l_hip_yaw', 'l_knee', 'l_ankle_pitch', 'l_ankle_roll',
    'r_hip_pitch', 'r_hip_roll', 'r_hip_yaw', 'r_knee', 'r_ankle_pitch',
    'r_ankle_roll'
]
# Specify the root link
root_link = 'root_link'
kinDyn = KinDynComputations(model_path, joints_name_list, root_link)
# choose the representation; if you want to use the body-fixed representation:
kinDyn.set_frame_velocity_representation(adam.Representations.BODY_FIXED_REPRESENTATION)
# or, if you want to use the mixed representation (that is the default):
kinDyn.set_frame_velocity_representation(adam.Representations.MIXED_REPRESENTATION)
w_H_b = np.eye(4)
joints = np.ones(len(joints_name_list))
M = kinDyn.mass_matrix(w_H_b, joints)
print(M)
```

🦸‍♂️ Contributing

adam is an open-source project. Contributions are very welcome!

Open an issue with your feature request or if you spot a bug. Then, you can also proceed with a pull request! :rocket:

Todo

- Center of Mass position
- Jacobians
- Forward kinematics
- Mass Matrix via CRBA
- Centroidal Momentum Matrix via CRBA
- Recursive Newton-Euler algorithm (still no acceleration in the algorithm, since it is used only for the computation of the bias force)
- Articulated Body algorithm
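In the snippets above, w_H_b is the 4x4 homogeneous transform of the base frame expressed in the world frame (the examples simply pass the identity). As a reminder of that convention, here is a small NumPy helper — hypothetical and independent of adam — that assembles such a transform from a rotation matrix and a translation:

```python
import numpy as np

def homogeneous_transform(R, p):
    """Build a 4x4 homogeneous transform [[R, p], [0, 1]] from a 3x3
    rotation matrix R and a 3-vector translation p."""
    H = np.eye(4)
    H[:3, :3] = R
    H[:3, 3] = p
    return H

# a 90-degree rotation about z, with the base placed at (1, 2, 0.5)
Rz = np.array([[0.0, -1.0, 0.0],
               [1.0,  0.0, 0.0],
               [0.0,  0.0, 1.0]])
w_H_b = homogeneous_transform(Rz, np.array([1.0, 2.0, 0.5]))
```

Any such matrix — not only np.eye(4) — can be passed as the floating-base pose when evaluating dynamics quantities.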
adams
ADAMS: Align Distance Matrix with SIFT algorithm enables GPU-accelerated protein structure comparison

Requirements

- opencv == 4.7.0.72
- numpy >= 1.17.2
- cupy-cuda111 == 12.2.0 or same as cuda version
- biopython == 1.81
- scipy == 1.11.2
- tqdm == 4.66.1
- cuda == 11.x or same as cupy version
- pickle

Installation

A pypi package is coming soon. Python source code is available above.

Please contact: [email protected] for more information

Tutorial and description

Introduction

We've developed a method to address the issue of numerous proteins exhibiting high structural similarity despite having no sequence similarities. This problem has become increasingly critical as Alphafold2 continues to predict new structures, resulting in a massive database (23 TiB, ver 4) that lacks an effective data-mining tool.

Foldseek offers a solution by embedding local structure into the sequence and transforming this issue into a sequence alignment problem. It's significantly faster than DALI, TM-Align, and CE-Align and outperforms them on structure comparison benchmarks. However, according to the Foldseek paper, we observed that Foldseek occasionally underperforms compared to DALI, indicating that some 'overall information' may not be captured within local structure embedding.

Our Align Distance Matrix with SIFT algorithm (ADAMS) is similar to DALI but uses an enhanced version of the renowned computer vision algorithm, the Scale Invariant Feature Transform (SIFT). It extracts key features from protein distance matrices at different scales and compares their similarities. Most calculations can benefit from GPU acceleration. This zero-shot model enables more precise structure comparisons at speeds comparable to Foldseek-TM tools. Users can create their own pdb databases on PCs for all-vs-all comparisons with increased speed and reduced memory usage (approximately 500 MB - 3 GB of GPU memory for a 20000 all-vs-all comparison).

The algorithm is illustrated in Fig. 1: the original SIFT algorithm is applied on distance matrices to extract detectable features across various scales. These features are represented as 128-dimension vectors, which are then stacked into an n x 128 matrix for comparison between two structures; cosine similarity between the two feature matrices is computed via an A x B.T operation. Given that these features have nearly identical lengths (512 ± 1.5), feature distances are determined by angles rather than length differences; thus, when the features are normalized beforehand, the similarity calculation becomes straightforward on GPUs.

The performance metrics are as follows: it took 3-4 seconds to search for the protein structure 'OSM-3' (699aa) within a C. elegans protein structure database (19361 structures) using an Nvidia RTX2080Ti (11 GiB) GPU. When loading the entire database onto the device, total GPU memory usage was around 4000 MB; when loaded separately, it only consumed about 500 MB of memory. Importantly, these different methods did not impact search speed.

The pre-print paper is here: https://www.biorxiv.org/content/10.1101/2023.11.14.566990v1.article-metrics

Tutorial

Installation

```
pip install adams
```

1. Download a pdb set and make it a cuda_database or a compatible one

```python
import adams
from adams.toolkit import *
from adams.db_maker import *

db = DatabaseMaker(device=0, process=40)  # use GPU-0, 40*1.5 processes
db.make('./pdb', './pdb_db')  # put your pdb dataset in one folder and make your database in another one
```

2. Match your protein structure to different databases

```python
import adams
from adams.tool_kit import *
from adams.matcher import ADAMS_match

matcher = ADAMS_match('./protein.pdb', gpu_usage=[0, 1], threshold=0.95)  # use gpu0 and gpu1
# search for similar protein structures in a database; returns a pandas dataframe.
# A temp folder is needed and will be created if it does not exist.
result = matcher.match('./pdb_db', 'tmp')
```

First, check the compare_all.py script:

```
compare_all.py
```

If permission is denied:

```
chmod +x path/to/compare_all.py
```
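The similarity computation described above — cosine similarity between stacked feature matrices, reduced to a single A x B.T matrix product on pre-normalized rows — can be sketched in NumPy (a CPU illustration of the idea, not the GPU/CuPy implementation used by ADAMS):

```python
import numpy as np

def feature_similarity(A, B):
    """Row-normalize two n x 128 SIFT-style feature matrices so every cosine
    similarity reduces to a dot product, then compute all pairwise
    similarities with one matrix multiplication (A @ B.T)."""
    A = A / np.linalg.norm(A, axis=1, keepdims=True)
    B = B / np.linalg.norm(B, axis=1, keepdims=True)
    return A @ B.T  # entry (i, j) = cos(A_i, B_j)

# comparing a feature set with itself: the diagonal is exactly 1
rng = np.random.default_rng(0)
A = rng.normal(size=(5, 128))
sim = feature_similarity(A, A)
```

On a GPU the same expression runs unchanged with CuPy arrays in place of NumPy arrays, which is what makes this formulation attractive for large all-vs-all comparisons.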
adam-sdk
Project name: AdamSDK - software development for controlling the Adam robot

Project description:

AdamSDK is an innovative project aimed at developing powerful and flexible software for controlling the Adam robot. The Adam robot is an advanced platform capable of performing a wide range of tasks in various fields, including industry, healthcare, education, and more.

AdamSDK is being developed to provide maximum flexibility and ease of use for developers and end users. With AdamSDK, developers gain access to a broad set of tools, libraries, and APIs that allow them to create and customize the Adam robot's functionality to meet the requirements of a specific application.

The main capabilities and functions of AdamSDK include:

- Motion control: AdamSDK provides programming interfaces for controlling the Adam robot's movements. Developers can program the robot's motion, speed, turns, and other aspects needed to perform tasks.

- Environment perception: AdamSDK allows the Adam robot to interact with its environment. It carries various sensors, such as cameras, microphones, eyes, and so on, which let the robot recognize objects, faces, and sounds and make decisions based on the data it receives.

- Integration with other systems: AdamSDK supports integration with other software and hardware systems. Developers can use the SDK to integrate the Adam robot with existing control systems, databases, or other devices, opening up even more possibilities for implementing various tasks.

- Extensibility and customization: AdamSDK is designed with an emphasis on extensible, customizable functionality. Developers can create their own modules and plugins to add new capabilities and functions to the Adam robot in line with a project's unique needs and requirements.

- Developer support: AdamSDK provides extensive documentation, code examples, and usage guides. Developers receive support from the AdamSDK development team, which is ready to help solve problems and to provide consultation and assistance in integrating the SDK into projects.

AdamSDK was created to make the Adam robot accessible and convenient for developers and users, so that they can bring their ideas to life and accomplish complex tasks with advanced robotics. This project opens the door to innovation and progress in various industries and contributes to the development of modern robotics-based solutions.