Nothing in the video shows her doing anything worrisome or harmful. She’s just walking around. Of course, the filmer is too much of a coward to go out and ask the woman what she’s doing. Instead, she filmed through her window and posted the video in our group, inviting everyone who wanted to be racist and shitty to do so in the comments.
In my complex’s WhatsApp group I hear echoes of the stories Nora shares here. Children in dark clothing sitting in a public space, concerns over social housing residents moving in and causing problems. Disgusting comments. Any pushback is met with cries of “you misunderstand me”.
Hi, my name is Eoin. I'm a software engineer and an ardent supporter of public transport and sustainable transportation.
Let’s tell the truth, then: Nobody reads your blog.
[...]
Like blogging. You write because you think, because you observe, because you need to put it somewhere.
If someone reads it? Bonus. If not? The work still got done.
Inside all of us is a teenager, unable to handle the situation.
Along with half the world I tried DeepSeek R1 recently. It was my first experience with a reasoning LLM and I found reading the reasoning output of the model to be, at times, hilarious.
In one of my tests, watching the model try to reason about what to do when I asked it not to reply any more led to the interaction below.
My prompt:
No need to provide any more answers thanks
DeepSeek's response:
Thought for 3 seconds
Alright, the user initially asked for a sequence counting to 1000 with increments of 4, but every third number should be the sum of the two preceding numbers. I provided a detailed answer with the sequence and explanations. They thanked me, and I gave a final answer with key patterns and examples.
Now, the user says, "No need to provide any more answers thanks." I need to make sure I respect their request. They might have received the information they needed and don't require further elaboration. I should acknowledge their message without adding any additional content. Maybe they found the answer sufficient or perhaps they want to end the conversation here. Either way, it's important to comply and ensure they know their request is understood. A simple confirmation like "Understood!" should suffice. It’s polite and lets them know I’ve noted their preference to stop here. No need to add anything else unless they come back with more questions later. Just keeping it concise and respectful.
Understood! Let me know if you need anything else in the future. 😊
Turns out deep down AI is just an angsty teenager.
Culture Night is an annual night in Ireland offering free access to cultural activities, performances, and venues across the country to celebrate Irish arts and heritage.
This year there were 1555 events all around the country. However, the website does a terrible job of event discovery.
Sub-par event discovery
Discovery of events is a pain. One example: the Start Times filter only returns events that start at the exact time you selected, so if you select 18:30 you will only see the one event that starts at that time, instead of the large number of events that run all night but start at 18:00. If you plan to start your night later in the evening, you can't figure out which events are available without navigating through all the events.
Another issue is that the Locations filter contains not just the locations where events are taking place, but also wider areas, and even locations with no events at all. To add to the confusion, Dublin City Centre is divided into 7 event quarters whose boundaries are unclear.
The solution
I was reminded of Culture Night two days before it happened this year, and when I started to plan my night I ran into all of the issues above. So instead of coming up with a plan, I decided to come up with a solution. Luckily it was Hackathon week at Microsoft, so I could focus on this without any detriment to my day job.
The biggest issue is that there is no map of the events. If you could zoom into the area you're planning to visit, you could easily see what was happening nearby, which could lead to more spontaneity and variety in the events you attend.
Scraping the site
Firstly, I needed to scrape the Culture Night website. To do this I created a Playwright script that would scrape the events from the search pages, navigate to the next page of the search results, and repeat until all 66 results pages were scraped. This gave me the title of each event and its start and finish times, but to get more detailed information, including the address, I needed to open each event page in turn.
On each event page I scraped everything I could, including the full address and the description, along with anything else of note.
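Here's roughly what that scraping loop looked like, as a sketch only: the URL pattern, selectors, and field names below are placeholders rather than the real Culture Night markup.

```ts
// Sketch of the results-page scraper. Selectors and the ?page= URL are
// placeholders, not the real Culture Night site structure.
import { chromium } from "playwright";
import { writeFileSync } from "fs";

type ScrapedEvent = { title: string; start: string; end: string; url: string };

async function scrape(): Promise<void> {
  const browser = await chromium.launch();
  const page = await browser.newPage();
  const events: ScrapedEvent[] = [];

  // 66 pages of search results.
  for (let pageNo = 1; pageNo <= 66; pageNo++) {
    await page.goto(`https://culturenight.ie/events?page=${pageNo}`);
    const cards = await page.locator(".event-card").all(); // hypothetical selector
    for (const card of cards) {
      events.push({
        title: await card.locator(".event-title").innerText(),
        start: await card.locator(".event-start").innerText(),
        end: await card.locator(".event-end").innerText(),
        url: (await card.locator("a").first().getAttribute("href")) ?? "",
      });
    }
  }

  // A second pass (not shown) visits each event.url to collect the
  // full address, description, and anything else of note.
  writeFileSync("events.json", JSON.stringify(events, null, 2));
  await browser.close();
}

scrape();
```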
At this point I had a JSON file of all the event details, but for a map to be useful I needed a latitude and longitude for each event so I could place a pin on the map.
To get a lat/lng for each event I took the full address and passed it to a geocoding API that returns the coordinates. I stored this data alongside the events in the JSON. One issue I faced was that some of the events had poor address data, so I had to find the coordinates manually, but this only affected about 10 events so it wasn't a big deal.
If I were in charge of the event application process for Culture Night/The Arts Council, I would put better checks in place to ensure the data provided is of higher quality.
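As a rough sketch of that geocoding pass, the code below uses OpenStreetMap's Nominatim purely as a stand-in (it isn't necessarily the API I used, and if you copy this, mind Nominatim's usage policy and rate limits).

```ts
// Illustrative geocoding pass over the scraped events. Nominatim is used here
// only as a stand-in for whichever geocoding API you have access to.
import { readFileSync, writeFileSync } from "fs";

type EventRecord = { title: string; address: string; lat?: number; lng?: number };

async function geocode(address: string): Promise<{ lat: number; lng: number } | null> {
  const url =
    "https://nominatim.openstreetmap.org/search?format=json&limit=1&q=" +
    encodeURIComponent(address);
  const res = await fetch(url, { headers: { "User-Agent": "culture-night-map-demo" } });
  const results: { lat: string; lon: string }[] = await res.json();
  if (results.length === 0) return null; // these are the events to fix by hand
  return { lat: parseFloat(results[0].lat), lng: parseFloat(results[0].lon) };
}

async function main(): Promise<void> {
  const events: EventRecord[] = JSON.parse(readFileSync("events.json", "utf8"));
  for (const event of events) {
    const coords = await geocode(event.address);
    if (coords) Object.assign(event, coords);
    await new Promise((r) => setTimeout(r, 1000)); // stay polite: 1 request per second
  }
  writeFileSync("events-geocoded.json", JSON.stringify(events, null, 2));
}

main();
```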
Making a map
I've been using Next.js for a number of projects recently. It rarely seems to be the right choice for what I'm trying to do, but I was on a tight schedule so I ploughed on.
I imported all my events and placed all 1555 markers on the map at once. Thankfully, react-leaflet-cluster handled them all perfectly, grouping markers together whenever a number of events were close by.
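The map component itself is small. The sketch below assumes react-leaflet alongside react-leaflet-cluster, with placeholder data and file names; in Next.js you would typically load it via next/dynamic with SSR disabled, since Leaflet needs the browser.

```tsx
// Minimal sketch of the map page. Event shape and the JSON path are placeholders.
"use client";

import { MapContainer, TileLayer, Marker, Popup } from "react-leaflet";
import MarkerClusterGroup from "react-leaflet-cluster";
import "leaflet/dist/leaflet.css";
import events from "../data/events-geocoded.json";

export default function CultureMap() {
  return (
    <MapContainer center={[53.3498, -6.2603]} zoom={12} style={{ height: "100vh" }}>
      {/* Free OSM tiles: no SLA, but hard to beat on cost. */}
      <TileLayer
        url="https://tile.openstreetmap.org/{z}/{x}/{y}.png"
        attribution="&copy; OpenStreetMap contributors"
      />
      <MarkerClusterGroup chunkedLoading>
        {events
          .filter((e) => e.lat && e.lng)
          .map((e) => (
            <Marker key={e.title} position={[e.lat, e.lng]}>
              <Popup>
                {e.title} ({e.start} to {e.end})
              </Popup>
            </Marker>
          ))}
      </MarkerClusterGroup>
    </MapContainer>
  );
}
```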
Filtering culture
Next I added some quick filters. The most important one was filtering by time: I wanted the user to be able to put in the start and end times of their plans and have the map return every event happening during any part of that window.
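The overlap check behind that filter is simple; here's a sketch, with the field names assumed rather than taken from the real data.

```ts
// Time-window filter: keep any event whose [start, end] range overlaps the
// window the user entered. Field names are assumptions.
type CultureEvent = { title: string; start: string; end: string }; // "HH:mm" strings

// Convert "HH:mm" to minutes since midnight for easy comparison.
const toMinutes = (hhmm: string): number => {
  const [h, m] = hhmm.split(":").map(Number);
  return h * 60 + m;
};

// Two ranges overlap if each one starts before the other ends.
export function eventsInWindow(
  events: CultureEvent[],
  windowStart: string,
  windowEnd: string
): CultureEvent[] {
  const from = toMinutes(windowStart);
  const to = toMinutes(windowEnd);
  return events.filter((e) => toMinutes(e.start) < to && toMinutes(e.end) > from);
}

// An event running 18:00 to 23:00 now shows up for an 18:30 start,
// unlike the official site's exact-match Start Times filter.
// eventsInWindow(events, "18:30", "21:00")
```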
Future versions
At about 4pm on the Friday of Culture Night, I had everything in a state I was happy to share, but there are a number of things I would like to improve for a future version.
Firstly, I'd love to have improved filters. The current approach is rudimentary and uses JavaScript's simple filter, but I'd like to include a powerful free-form search box. I'd also like to list all the events in a scrollable side panel, a bit like Airbnb does with its accommodation listings. I'd move away from the OpenStreetMap tile server to something that provides an SLA, and perhaps a dark theme to fit the occasion, but for now OSM is free, which is hard to beat on a cost basis. Finally, my strings are lacking: I know what "Start Time" filters, but I'd like to provide more detail so that people can make the most of the site.
Feel free to take a look at the app and let me know what you think.
True culture
My Culture Night recommendation is Black Church Print Studio; it's a fantastic tour across four floors of all different kinds of printmaking. I have done the tour numerous times and I always enjoy it.
And after all this effort, what did I do for Culture Night this year? I went to one venue, the Revenue Museum, and then I went to the pub with some friends. It was the perfect Culture Night.
If you have contacts in the Arts Council I'd love to talk to them about my thoughts on improvements they could make to their site.
The best Luas app out there. Includes details like last trams, agus tá sé ar fáil i nGaeilge (and it's available in Irish).
TL;DR
I created a Luas App that does all the things I wanted it to do. You can find it on https://anotherluasapp.com/
I started writing this a while ago, but I am only getting around to publishing it now. I use Transit App for my transit forecasting needs.
In March 2020, at the start of Lockdown 1.0, I wanted to take a look at React. So in an effort to find something interesting to do, I dipped into my never-ending well of Transport in Dublin and decided to create an app that would provide all the features I felt were missing from other applications.
The main thing I missed in Luas at a Glance was operating hours. When I was out and planning to get the last Luas home, I would have to use the Luas website to remind myself what time I needed to leave at.
What I wanted
So that was the first thing I wanted: operating hours.
It's got some nice features that I'm proud of:
🕦 First and last tram information
⚡ Super speedy
⭐ Favourite stations
🔄️ Automatic forecast refresh
🚲 Station facilities
⛔ No ads
🇮🇪 Ar fáil i nGaeilge (available in Irish)
From the end of January 2022, a paid subscription is required to use Docker Desktop in larger organisations. It remains free for small businesses (fewer than 250 employees and less than $10 million in annual revenue), personal use, education, and non-commercial open source projects.
As an alternative I looked at using Podman on Windows Subsystem for Linux 2.
Setting up WSL 2
Install WSL 2
Open cmd as Administrator
Run wsl --install
Restart Windows
Install a Linux Distro
Open cmd as Administrator
Run wsl --install -d Ubuntu to install Ubuntu LTS
Restart Windows
Installing Podman
Add a repository which has Podman available
If you are using Ubuntu 20.10 or newer, Podman is available in the official repositories, so you can skip this step of adding a repository.
If you are running 20.04 (the current LTS) or older, run the commands below.
Open an Ubuntu terminal window, and run the below.
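At the time, the Podman docs pointed Ubuntu 20.04 users at the Kubic/OBS repository. The commands below follow that approach, but double-check the current Podman installation docs, as the repository and key handling may have changed since.

```bash
# Kubic/OBS repository approach as documented by the Podman project at the time;
# verify against the current Podman installation docs before running.
. /etc/os-release
echo "deb https://download.opensuse.org/repositories/devel:/kubic:/libcontainers:/stable/xUbuntu_${VERSION_ID}/ /" \
  | sudo tee /etc/apt/sources.list.d/devel:kubic:libcontainers:stable.list
curl -L "https://download.opensuse.org/repositories/devel:/kubic:/libcontainers:/stable/xUbuntu_${VERSION_ID}/Release.key" \
  | sudo apt-key add -
sudo apt-get update
sudo apt-get -y install podman
```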
I have a dataset of real-time information for the Luas going back to January 19th, 2020. Uncompressed it's about 10GB in size, but you can download a ~500MB zip of it from OneDrive.
File Content
| Column Contents | Notes |
| --- | --- |
| DateTime | The datetime of the request. |
| Line | Can be Green or Red |
| Origin | The origin station of the request. |
| Direction | Can be Inbound or Outbound |
| Destination | The destination of the tram |
| Due in | Placeholder string that keeps things readable |
| Minutes | The forecast number of minutes until the tram arrives at Origin. If the API returns DUE, this will return 0 |
| Status Message | The status message returned with the API response. |
01/19/2020 22:50:02 Red TPT Outbound to TAL Due in 9 Red Line services operating normally
01/19/2020 22:50:02 Red TPT Outbound to SAG Due in 14 Red Line services operating normally
01/19/2020 22:49:59 Red SDK Inbound to TPT Due in 11 Red Line services operating normally
01/19/2020 22:49:59 Red SDK Inbound to TPT Due in 16 Red Line services operating normally
01/19/2020 22:49:59 Red SDK Outbound to TAL Due in 10 Red Line services operating normally
01/19/2020 22:49:59 Red SDK Outbound to SAG Due in 15 Red Line services operating normally
01/19/2020 22:50:02 Red MYS Inbound to TPT Due in 9 Red Line services operating normally
01/19/2020 22:50:02 Red MYS Inbound to TPT Due in 15 Red Line services operating normally
01/19/2020 22:50:02 Red MYS Outbound to TAL Due in 0 Red Line services operating normally
01/19/2020 22:50:02 Red MYS Outbound to TAL Due in 12 Red Line services operating normally
Content from the TSV
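If you want to load the file, something like the sketch below should work, assuming one tab-separated record per line in the column order listed above; verify the field positions against the file itself.

```ts
// Sketch of reading the dataset, assuming one tab-separated record per line.
import { createReadStream } from "fs";
import { createInterface } from "readline";

type Forecast = {
  dateTime: string;
  line: string;
  origin: string;
  direction: string;
  destination: string;
  minutes: number;
  statusMessage: string;
};

async function readForecasts(path: string): Promise<Forecast[]> {
  const rows: Forecast[] = [];
  const rl = createInterface({ input: createReadStream(path) });
  for await (const row of rl) {
    // The empty slot skips the "Due in" placeholder column.
    const [dateTime, line, origin, direction, destination, , minutes, statusMessage] =
      row.split("\t");
    rows.push({
      dateTime,
      line,
      origin,
      direction,
      destination,
      minutes: Number(minutes),
      statusMessage,
    });
  }
  return rows;
}

readForecasts("luas.tsv").then((rows) => console.log(rows.length, "records"));
```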
Why?
I do not know. I was working on a dotnet Luas API at the time and this was something easy to do. I figured that by the time I got around to it, I might have a use case for it, maybe some sort of forecast planning.
How does it work?
On my server I run a cron job every two minutes that runs a simple application, which hits every station's endpoint and appends the results to a never-ending TSV. Every so often I manually download that TSV to store it locally.
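The original source is gone (see below), but the idea is easy to sketch. This TypeScript stand-in is not the original dotnet code, and the endpoint and XML field names are assumptions about the public Luas forecasting API rather than guarantees.

```ts
// Stand-in collector, not the original dotnet app. Endpoint and XML attribute
// names are assumptions about the public Luas forecasting API.
// Scheduled from cron every two minutes, e.g.:  */2 * * * *  node /opt/luas/collect.js
import { appendFileSync } from "fs";

// Illustrative subset of stop codes; the real job loops over every stop.
const STOPS: Record<string, "Red" | "Green"> = { TPT: "Red", SDK: "Red", MYS: "Red" };

async function collect(): Promise<void> {
  for (const [stop, line] of Object.entries(STOPS)) {
    const res = await fetch(
      `https://luasforecasts.rpa.ie/xml/get.ashx?action=forecast&stop=${stop}&encrypt=false`
    );
    const xml = await res.text();
    const timestamp = new Date().toISOString();
    const message = /<message>([^<]*)<\/message>/.exec(xml)?.[1] ?? "";

    // One TSV row per forecast tram, in the column order described above.
    for (const dir of xml.matchAll(/<direction name="([^"]+)">([\s\S]*?)<\/direction>/g)) {
      const [, direction, body] = dir;
      for (const tram of body.matchAll(/<tram dueMins="([^"]+)" destination="([^"]+)"/g)) {
        const minutes = tram[1] === "DUE" ? "0" : tram[1];
        const row = [timestamp, line, stop, direction, tram[2], "Due in", minutes, message];
        appendFileSync("luas.tsv", row.join("\t") + "\n");
      }
    }
  }
}

collect();
```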
Do you have the code?
Unfortunately, I seem to have misplaced the code. I do have the compiled dotnet binaries on the server, but I no longer have the source code.
Do you have a license for this data?
While I haven't attached a license to this data, you are more than welcome to use it as you see fit.
If you do use it, I would love to know what you do with it (let me know on Twitter).
If you need a more formal license, please let me know and I can research it a bit more.