Published:
You might have already seen my blog post on OpenStreetMap from 2020. In that post, I briefly talked about the Overpass API server, with a pointer to a GitHub repo for setting it up locally. However, that setup might fail if we try to build for the entire planet. Additionally, the source repository hasn't been updated for 5 years. I've received some requests to assist in setting up the Overpass server, so this short post illustrates the process of running a local Overpass server.
Currently, the process is very simple and straightforward, thanks to this Docker image.
This approach is beneficial when our region of interest is small (compared to the entire world).
```shell
docker run -e OVERPASS_META=yes \
    -e OVERPASS_MODE=init \
    -e OVERPASS_PLANET_URL=http://download.geofabrik.de/europe/monaco-latest.osm.bz2 \
    -e OVERPASS_DIFF_URL=http://download.openstreetmap.fr/replication/europe/monaco/minute/ \
    -e OVERPASS_RULES_LOAD=10 \
    -v /overpass_db/:/db/overpass_clone_db \
    -p 8888:80 -it --name overpass_monaco wiktorn/overpass-api
```
This usually takes about 5 minutes on a normal computer, including downloading the image from Docker Hub, downloading the OSM file from Geofabrik, and building the database for a ~700 KB file. All the generated builds can be used with the server. At the end of this build, the Docker container will be stopped and needs to be started again. For that, either assign a name to the container in the previous step, or look up the auto-generated name with the `docker ps --all` command. After grabbing the name, simply start the container with `docker start <CONTAINER NAME>`. Alternatively, one can pass the `-e OVERPASS_STOP_AFTER_INIT=false` option so that the instance keeps running after flushing the database.
With that, we can query for a pizza shop.
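The query itself is not reproduced in this post, so below is a minimal sketch of how such a query could look against the local server, using only the Python standard library. The endpoint URL, the bounding box, and the tag choice (`cuisine=pizza`) are my assumptions, not taken from the original:

```python
import json
import urllib.parse
import urllib.request

# Assumed endpoint of the local server started above
OVERPASS_URL = "http://localhost:8888/api/interpreter"


def build_amenity_query(key: str, value: str, bbox: tuple) -> str:
    """Overpass QL selecting nodes tagged key=value inside bbox,
    given as (south, west, north, east)."""
    south, west, north, east = bbox
    return (
        "[out:json][timeout:25];"
        f'node["{key}"="{value}"]({south},{west},{north},{east});'
        "out body;"
    )


def run_query(query: str, url: str = OVERPASS_URL) -> list:
    """POST the query to the Overpass interpreter and return the elements."""
    data = urllib.parse.urlencode({"data": query}).encode()
    with urllib.request.urlopen(url, data=data) as resp:
        return json.load(resp).get("elements", [])


# Approximate Monaco bounding box (illustrative); run_query(q) would hit the server
q = build_amenity_query("cuisine", "pizza", (43.72, 7.40, 43.76, 7.44))
print(q)
```

Calling `run_query(q)` returns a list of matching nodes with their tags and coordinates, provided the container from the previous step is running.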
In the above, we didn't use a custom region (such as one obtained from the OSM tool shown here), which doesn't work without tweaks. The workaround should be simple: either provide the file via `file:///` as mentioned here, or host the file with a local HTTP server.
Note: So far, I have not been able to circumvent an issue when I try to mount a local folder in the current directory: although the build succeeds, the server returns an error during queries. This is specific to Windows only.
When we need to scale up to the entire world, cloning is a better option than building from the raw OSM file. In this case, we can set the `OVERPASS_MODE` option to `clone`. The data will be cloned from the Overpass API server with the defined replication. You can check out the available replication frequencies on the OSM wiki.
```shell
docker run -e OVERPASS_MODE=clone \
    -e OVERPASS_DIFF_URL=https://planet.openstreetmap.org/replication/day/ \
    -v /big/docker/overpass_clone_db/:/db \
    -p 8888:80 -it --name overpass_world wiktorn/overpass-api
```
This process took approximately 2 hours with a good internet connection on a Linux machine, and the database took approximately 204 GB. For sure, this number will grow with more contributions. We can now query, for example, for Indian restaurants near a point of interest.
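The original query is not shown here, so as a hedged sketch: an `around` filter is the usual Overpass QL way to search near a point. The coordinates, radius, and tag choice below are placeholders of my own choosing, not from the original post:

```python
def build_around_query(key: str, value: str, lat: float, lon: float, radius_m: int) -> str:
    """Overpass QL selecting nodes tagged key=value within radius_m metres of (lat, lon)."""
    return (
        "[out:json][timeout:25];"
        f'node["{key}"="{value}"](around:{radius_m},{lat},{lon});'
        "out body;"
    )


# Indian restaurants within 2 km of a placeholder POI (coordinates are illustrative)
query = build_around_query("cuisine", "indian", 52.52, 13.405, 2000)
print(query)
```

The resulting string can be POSTed to `http://localhost:8888/api/interpreter` exactly as in the Monaco example earlier.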
I have a sample REST API built with Flask which finds the nearest toilets to a query point. Using the above server is easy; we just need to change the URL of the Overpass API server mentioned here to `self.server = 'http://localhost:8888/api/interpreter'`.
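To illustrate that one-line change, here is a minimal sketch of such a service object with a configurable Overpass endpoint. The class and method names are hypothetical and not the actual Flask app's API; only the `self.server` assignment mirrors the change described above:

```python
import json
import urllib.parse
import urllib.request


class ToiletFinder:
    """Hypothetical service object: the Overpass endpoint is configurable,
    so it can point at the local server instead of the public one."""

    def __init__(self, server: str = "http://localhost:8888/api/interpreter"):
        self.server = server  # the one-line change discussed above

    def build_query(self, lat: float, lon: float, radius_m: int = 1000) -> str:
        # Overpass QL for public toilets within radius_m metres of the query point
        return (
            "[out:json][timeout:25];"
            f'node["amenity"="toilets"](around:{radius_m},{lat},{lon});'
            "out body;"
        )

    def find(self, lat: float, lon: float, radius_m: int = 1000) -> list:
        data = urllib.parse.urlencode(
            {"data": self.build_query(lat, lon, radius_m)}).encode()
        with urllib.request.urlopen(self.server, data=data) as resp:
            return json.load(resp).get("elements", [])
```

A Flask route would then simply call `ToiletFinder().find(lat, lon)` with the coordinates from the request.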
Reach out to me on Instagram for a faster reply!
Published:
I have been working with various neural networks for a while and always find recurrent neural networks (RNNs) very special. Whenever there is a dependency on previous data points, these networks shine. Time-series problems fit especially well here, e.g. predicting weather patterns or the stock market. In the past, I have compared various RNNs for such tasks, and my work concluded that the ESN works quite well compared to a simple deep neural network and some RNNs like LSTM or GRU (paper-1, paper-2 and paper-3) (of course, this statement varies with the domain). Those works were done mostly with scientific data for the turbulent flow of thermal plumes, which has significant effects on weather (Wikipedia). Now, I was curious to see how these networks can work for practical purposes.
Published:
This is a rather short documentation on running TensorFlow jobs on a high-performance computing (HPC) cluster that uses LSF (Load Sharing Facility). It can be extrapolated to other HPC systems with some tweaks. If you're new to LSF and HPC jobs with DL, then this post nicely summarizes the jargon.
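The post's own submission script isn't shown here, so below is a hedged sketch of what a minimal LSF script for a TensorFlow job could look like. Queue defaults, module names, environment paths, and the GPU-request syntax are site-specific assumptions and vary between LSF versions; check your cluster's documentation:

```shell
#!/bin/bash
#BSUB -J tf_train           # job name
#BSUB -n 4                  # number of CPU cores
#BSUB -W 02:00              # wall-clock limit (hh:mm)
#BSUB -gpu "num=1"          # request one GPU (syntax varies by LSF version)
#BSUB -o tf_train.%J.out    # stdout file (%J expands to the job ID)
#BSUB -e tf_train.%J.err    # stderr file

# Module and environment names below are illustrative assumptions
module load cuda
source ~/venvs/tf/bin/activate

python train.py
```

Submit the script with `bsub < job.lsf` and monitor it with `bjobs`.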
Published:
[Updated on 05.08.2022]
Firebase is a Backend-as-a-Service (BaaS) app development platform backed by Google. It provides a variety of services like databases, cloud storage, etc. The most prominent use case (at least for me) is having a realtime database in almost no time to enable quick prototyping. Recently, I've experimented with the Python and Swift (iOS) SDKs for Firebase, and this documentation summarizes the same.
The intention behind adding a continuous stream of data to Firebase is inspired by Internet of Things (IoT) sensors, where data keeps coming in, and based upon the data we might want to trigger some action. There are a few posts already available, like this, this and this. As the simplest case, my target is to acquire some time-series data at a given interval and send it to Firebase continuously. As no external sensor is attached to my system, I decided to collect system information like RAM usage, CPU usage, etc. at a given frequency and then add it to our database.
Creating a Firebase account and setting up a database is straightforward with a Google ID. Here, we will use the Firebase Realtime Database:

1. In the Firebase console, go to **Build > Realtime Database**, click on **Create a Database**, and select the geographical location accordingly.
2. With the **+** button we could add data manually; however, we will do this via Python.
3. Open the **Rules** tab and set the rules to `true`. Note that this will allow anyone to read/write the data! For more details, see the official docs.
4. Copy the `firebaseConfig` part (i.e. the dictionary content inside the `{}`) and save it as a text file named `credential.txt` in our Python project folder locally.
5. Install the SDK with `pip install pyrebase4`. For me, the normal `pyrebase` was throwing an error, so this was the easiest solution I found at that time.
6. Keep `credential.txt` in the same folder as the following script:

```python
import pyrebase
import psutil
import platform
import time
import shutil
import datetime


def get_device_name():
    # Hostname of the machine acts as the device identifier
    return platform.node()


def get_timestamp():
    ts = time.time()
    return datetime.datetime.fromtimestamp(ts).strftime('%d%m%YT%H%M%S')


def sys_info():
    # Collect RAM, CPU and disk usage as our "sensor" readings
    disc_usage = shutil.disk_usage("/")
    info = {}
    info["ram_usage"] = psutil.virtual_memory()[2]
    info["cpu_usage"] = psutil.cpu_percent()
    info["disk_usage"] = disc_usage[1] / disc_usage[0] * 100
    return info


def read_cred(filename: str) -> dict:
    # Parse the firebaseConfig key/value pairs saved in credential.txt
    d = {}
    with open(filename) as f:
        for line in f:
            (key, val) = line.split(': "')
            d[key.strip()] = val.split('"')[0]
    return d


def initialize_firebase():
    config = read_cred(filename="credential.txt")
    firebase = pyrebase.initialize_app(config)
    return firebase


def add_data_to_firebase(db):
    data = sys_info()
    # Store the readings under <device name>/<timestamp>
    resp = db.child(get_device_name()).child(get_timestamp()).set(data)
    return resp


def print_all_data_in_db(db):
    users = db.child().get()
    print(users.val())


if __name__ == "__main__":
    db = initialize_firebase().database()
    ctr = 0
    res = []
    while ctr < 3:
        print("Instance: ", ctr)
        res.append(add_data_to_firebase(db))
        time.sleep(2)
        ctr += 1
    print("Done!")
```
This part makes a similar attempt with the Firebase RT database using the Swift programming language, targeting iOS development. The steps are similar and as intuitive as above, but for the sake of completeness:
1. Download the config file (`GoogleService-Info.plist`), then drag and drop it into Xcode's file navigator as shown in the official SDK documentation.
2. Add the Firebase dependencies to the `podfile`. My `podfile` looks as follows. If you don't have a `podfile`, then download CocoaPods and initialize with `pod init`. With `pod install`, the dependencies will be installed. Remember, it's a big download, so it will take some time. Also, I needed to close Xcode, otherwise I kept getting errors.

```ruby
# Uncomment the next line to define a global platform for your project
platform :ios, '12.0'

target 'ObjectDetection' do
  # Comment the next line if you're not using Swift and don't want to use dynamic frameworks
  use_frameworks!

  # Pods for ObjectDetection
  pod 'TensorFlowLiteSwift'

  # Pods for Firebase
  pod 'FirebaseAuth'
  pod 'FirebaseFirestore'
  pod 'FirebaseDatabase'
end
```
In `AppDelegate.swift`, do the following (again, as suggested in the Firebase setup process):

```swift
import UIKit
import FirebaseCore

@UIApplicationMain
class AppDelegate: UIResponder, UIApplicationDelegate {
    var window: UIWindow?

    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions:
                     [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        FirebaseApp.configure()
        return true
    }
}
```
```swift
import UIKit
import FirebaseDatabase

func writeToFirebase(outputClass: String, outputClassScore: Float) {
    let dateString = "SampleDateAndTime"
    let locationData = "SampleLocation"
    let ref = Database.database().reference().child("deviceID/\(deviceID)").child("\(dateString)")
    ref.updateChildValues(["location": locationData,
                           "outputClass": outputClass,
                           "outputClassScore": outputClassScore])
}

// Get the deviceID
let deviceID = UIDevice.current.identifierForVendor!.uuidString

// Then we call this function with the arguments
writeToFirebase(outputClass: "myOutputClass", outputClassScore: 1.0)
```
This will then write the desired results to Firebase. For me, the use case was to deploy an ML model on an iPhone and then send the relevant data to Firebase, so that I can query it and build a relevant dashboard.
Explore and examine the following and update the documentation:
Reach out to me on Instagram for a faster reply!
Published:
[Updated on 14.09.2022] [Updated on 04.07.2022]
Published:
This is a (growing) general-purpose list (for my own documentation purposes) of some of the resources that I have found very helpful.
Published:
**05.06.2020 - Discontinued permanently due to the unavailability of a direct data API.**
Published in , 2019
Here you can find a list of conference proceedings.
Published in , 2020
This link contains the full list of publications in various journals.
Published:
November 2018: Our work on machine learning and DNS was [featured by GCS](https://www.hlrs.de/whats-new/news/archive/detail-view/2018-11-05/).