Archive for Juni 26th, 2024

Deutscher Bundesminister für Wirtschaft und Energie Dr. Robert Habeck (Grüne) – beim Tag der Industrie 2024

Mittwoch, Juni 26th, 2024

Das Unfallkrankenhaus in Berlin Marzahn (ukb) – aus welchen Gründen ein Gesundheitscampus direkt auf dem Klinikgelände so bedeutsam ist

Mittwoch, Juni 26th, 2024

Hoymiles Micro Storage (MS) – the easy install solar battery storage solution built for balconies

Mittwoch, Juni 26th, 2024

AVM FRITZ!OS 7.90-114112 Labor – für die FRITZ!Box 7590 AX veröffentlicht und erfolgreich installiert

Mittwoch, Juni 26th, 2024

Proxmox Virtual Environment (VE) 8.2.4 – how to use your first local ‚Meta Llama 3‘ Large Language Model (LLM) project without the need for a GPU

Mittwoch, Juni 26th, 2024

root@pve-ai-llm-01:~# ollama run llama3

>>> how to use kalman filter with microsoft power bi
What a great question!
While the Kalman filter is a powerful algorithm for estimating uncertain states from
noisy measurements, it’s not directly supported by Microsoft Power BI. However, I’ll
provide some guidance on how you can use the Kalman filter in Power BI, albeit
indirectly.
**Option 1: Use R Scripting**
Power BI supports R scripting through its `R Script` visualization. You can write an R
script that implements the Kalman filter using a library like `kalmanFilter`
(available on CRAN) and then deploy it to Power BI.
Here’s a high-level example of how you might use this approach:
1. Create a new R script in Power BI.
2. Install the `kalmanFilter` package using the `install.packages(„kalmanFilter“)`
command.
3. Implement your Kalman filter algorithm using the `kalmanFilter` library and
relevant data from your dataset (e.g., measurements, process noise covariance).
4. Use the R script to estimate the state of interest (e.g., position, velocity) based
on the noisy measurements.
5. Visualize the estimated state using a Power BI visualization (e.g., line chart,
scatter plot).
**Option 2: Use Python Scripting**
Power BI also supports Python scripting through its `Python Script` visualization. You
can use a library like `scipy` and implement the Kalman filter algorithm in Python.
Here’s a high-level example of how you might use this approach:
1. Create a new Python script in Power BI.
2. Install the `scipy` library using the `import scipy` command.
3. Implement your Kalman filter algorithm using the `scipy.linalg.kalman_filter`
function and relevant data from your dataset (e.g., measurements, process noise
covariance).
4. Use the Python script to estimate the state of interest (e.g., position, velocity)
based on the noisy measurements.
5. Visualize the estimated state using a Power BI visualization (e.g., line chart,
scatter plot).
**Additional Tips**
When implementing the Kalman filter in either R or Python, keep in mind:
* Make sure to handle any necessary data preprocessing and manipulation before
applying the Kalman filter.
* Use relevant libraries and functions for handling matrices and linear algebra
operations.
* Be mindful of the computational complexity and optimize your code as needed.
While these options allow you to use the Kalman filter with Power BI, they do require
some programming expertise. If you’re not familiar with R or Python, you might need to
invest time in learning the basics before implementing the Kalman filter.
Remember that the Kalman filter is a powerful algorithm for state estimation, but it’s
just one tool among many for analyzing and visualizing data. Power BI provides a wide
range of built-in visualizations and capabilities that can help you create insightful
and engaging reports without needing to implement the Kalman filter directly.
>>> Send a message (/? for help)
>>> /bye
root@pve-ai-llm-01:~#

Mittwoch, Juni 26th, 2024

Microsoft Power BI – three must have visuals

Mittwoch, Juni 26th, 2024

Proxmox Virtual Environment (VE) 8.2.4 – how to use your first local ‚Meta Llama 3‘ Large Language Model (LLM) project without the need for a GPU

Mittwoch, Juni 26th, 2024

root@pve-ai-llm-01:~# ollama run llama3

>>> how to log a odbc connection with tcpdump
A clever question!
To log an ODBC (Open Database Connectivity) connection using `tcpdump`, you’ll need to capture the TCP packets that are exchanged between your application
and the database server. Here’s a step-by-step guide:
**Prerequisites:**
1. You have `tcpdump` installed on your system.
2. Your application uses ODBC to connect to the database server.
3. The database server listens for incoming connections on a specific TCP port (e.g., 5432 for PostgreSQL or 3306 for MySQL).
**Capturing ODBC connection packets:**
1. Identify the TCP ports used by your application and the database server. For example, if your application uses ODBC to connect to a PostgreSQL database
listening on port 5432, you’ll want to capture packets on that port.
2. Run `tcpdump` with the following options:
„`
tcpdump -n -vv -s 0 -c 1000 -W 1024 -i any port 5432
„`
Here’s what each option does:
* `-n`: Don’t resolve hostnames (use IP addresses instead).
* `-vv`: Increase verbosity for more detailed output.
* `-s 0`: Capture packets of all sizes, including small ones.
* `-c 1000`: Limit the capture to 1000 packets.
* `-W 1024`: Set the maximum packet size to 1024 bytes (adjust this value if you need to capture larger packets).
* `-i any`: Capture traffic on any interface (e.g., `eth0`, `wlan0`, etc.).
* `port 5432`: Specify the port number to capture traffic for (in this case, PostgreSQL’s default port).
**Capturing ODBC connection establishment:**
To capture the exact moment when an ODBC connection is established, you can use a filter to capture only the initial handshake packets.
1. Run `tcpdump` with the following options:
„`
tcpdump -n -vv -s 0 -c 1000 -W 1024 -i any port 5432 ‚tcp src port 32773 and tcp dst port 5432‘
„`
Here’s what this filter does:
* `tcp`: Capture TCP packets.
* `src port 32773`: Only capture packets with the source port set to 32773 (a common ODBC port).
* `dst port 5432`: Only capture packets with the destination port set to 5432 (the PostgreSQL port).
**Analyzing captured packets:**
Once you’ve captured the ODBC connection packets, you can use tools like `tcpdump` or `Wireshark` to analyze them. You can filter the captured packets by
protocol (e.g., TCP), port numbers, or specific packet contents.
In Wireshark, for example, you can:
1. Open the capture file.
2. Apply a filter using the „Apply a display filter“ option.
3. Use the `tcp` protocol filter to see only TCP packets.
4. Look for packets with the source port set to 32773 (ODBC) and destination port set to 5432 (PostgreSQL).
By analyzing these packets, you should be able to see the ODBC connection establishment process, including the initial handshake, authentication, and data
transmission.
Keep in mind that capturing network traffic can reveal sensitive information about your system and database. Make sure to use `tcpdump` or other capture
tools responsibly and with caution.
>>> Send a message (/? for help)
>>> /bye
root@pve-ai-llm-01:~#

St. Willibrord Spital Emmerich – Emmericher kämpfen um ihr insolventes Krankenhaus

Mittwoch, Juni 26th, 2024