Getting started with the spark core
For Hackware v0.3
claudiomettler
January 14, 2015
Transcript
getting started with the spark core
hardware
• 72MHz ARM Cortex M3
• TI CC3000 WiFi
• 7 analog IO (pwm out), 7 digital IO
• ~ S$ 55
• coming soon: Photon, 120MHz, Broadcom WiFi, half the price
setup, the easy and unreliable way
• install smartphone app
• power up spark
• use smartphone app to create cloud account and register the core
setup, the better way (USB)
npm install -g spark-cli
spark setup
ready?
• your spark core should have a name and be breathing cyan by now
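A quick way to confirm this from the command line (a sketch; it assumes the spark-cli installed on the previous slide and that you are logged in to your cloud account):
spark login
spark list
# should show the freshly claimed core by name, marked as online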
the tinker firmware
• default firmware of spark core
• allows access to all inputs/outputs
let’s ask the cloud
export SPARK_CORE_ID=53ff71065075535128311587
export SPARK_ACCESS_TOKEN=[get this from https://spark.io/build or “spark login”]
let’s ask the cloud
curl -H "Authorization: Bearer $SPARK_ACCESS_TOKEN" https://api.spark.io/v1/devices/$SPARK_CORE_ID/
Alternatively include the token in the URL:
https://api.spark.io/v1/devices/$SPARK_CORE_ID/?access_token=$SPARK_ACCESS_TOKEN
output
{
  "id": "53ff71065075535128311587",
  "name": "schnitzel",
  "connected": true,
  "variables": {},
  "functions": [
    "digitalread",
    "digitalwrite",
    "analogread",
    "analogwrite"
  ],
  "cc3000_patch_version": "1.29"
}
using functions
curl https://api.spark.io/v1/devices/$SPARK_CORE_ID/digitalwrite?access_token=$SPARK_ACCESS_TOKEN -d params=D7,LOW
or include access_token in POST data:
curl https://api.spark.io/v1/devices/$SPARK_CORE_ID/digitalwrite -d access_token=$SPARK_ACCESS_TOKEN -d params=D7,LOW
response
{
  "id": "53ff71065075535128311587",
  "name": "schnitzel",
  "last_app": null,
  "connected": true,
  "return_value": 1
}
the cloud API
• events, functions, variables
• http://docs.spark.io/api/
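Variables and events follow the same URL pattern as functions (a sketch based on the API docs; “temperature” is a hypothetical variable your firmware would have to register with Spark.variable):
# read a cloud variable exposed by the firmware
curl https://api.spark.io/v1/devices/$SPARK_CORE_ID/temperature?access_token=$SPARK_ACCESS_TOKEN
# subscribe to the core’s event stream (server-sent events, the connection stays open)
curl https://api.spark.io/v1/devices/$SPARK_CORE_ID/events?access_token=$SPARK_ACCESS_TOKEN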
the javascript API
• runs in browser and in nodejs
• (npm|bower) install spark
• http://docs.spark.io/javascript/
voodoospark
• alternative firmware
• faster communication through local TCP connection
• npm client module
• still uses cloud for facilitating local connection
DIY, noob mode
DIY, noob mode #2
DIY
spark compile myapp.ino
spark flash firmware_1421160254713.bin
DIY, hardcore mode
brew tap PX4/homebrew-px4
brew update
brew install gcc-arm-none-eabi-48
brew install dfu-util
git clone https://github.com/spark/core-firmware.git
git clone https://github.com/spark/core-common-lib.git
git clone https://github.com/spark/core-communication-lib.git
DIY, hardcore mode
cd core-firmware/build
make
flash via USB
• press & hold mode button, tap and release RST button until LED flashes yellow
• spark core is now in DFU mode
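To double-check that the core really is in DFU mode before flashing (a sketch; dfu-util comes from the hardcore-mode install above, and the 1d50:607f USB id is an assumption about the core’s DFU mode):
# list DFU-capable USB devices; the core should appear as 1d50:607f
dfu-util -l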
flash via USB
make program-dfu
flash via cloud
make program-cloud
create your own app
mkdir core-firmware/applications/myapp
vim core-firmware/applications/myapp/application.cpp
cd core-firmware/build
make APP=myapp
make APP=myapp program-dfu # or program-cloud
app structure
#include "application.h"

int example(String command);

void setup() {
    // set up cloud functions & variables, IO pin modes
    Spark.function("example", example);
    pinMode(D7, OUTPUT);
}

void loop() {
    // called continuously
}

int example(String command) {
    digitalWrite(D7, HIGH);
    delay(2000);
    digitalWrite(D7, LOW);
    return 1;
}
app structure
{
  "id": "53ff71065075535128311587",
  "name": "schnitzel",
  "connected": true,
  "variables": {},
  "functions": [
    "example"
  ],
  "cc3000_patch_version": "1.29"
}
documentation
• http://docs.spark.io/firmware/
my current project
• websockets
• static site hosted on S3
• direct connection from browser to spark
• https://github.com/ponyfleisch/cloudlamp