<?xml version="1.0" encoding="UTF-8"?>
<feed xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns="http://www.w3.org/2005/Atom">
<title>Article in conference proceedings</title>
<link href="http://hdl.handle.net/10563/1001717" rel="alternate"/>
<subtitle/>
<id>http://hdl.handle.net/10563/1001717</id>
<updated>2026-04-06T13:13:07Z</updated>
<dc:date>2026-04-06T13:13:07Z</dc:date>
<entry>
<title>Statistical evaluation of metal plated polymer surfaces</title>
<link href="http://hdl.handle.net/10563/1012769" rel="alternate"/>
<author>
<name>Kubišová, Milena</name>
</author>
<author>
<name>Knedlová, Jana</name>
</author>
<author>
<name>Endlerová, Dagmar</name>
</author>
<author>
<name>Ludrovcová, Barbora</name>
</author>
<id>http://hdl.handle.net/10563/1012769</id>
<updated>2026-02-19T10:08:27Z</updated>
<published>2025-01-01T00:00:00Z</published>
<summary type="text">Statistical evaluation of metal plated polymer surfaces
Kubišová, Milena; Knedlová, Jana; Endlerová, Dagmar; Ludrovcová, Barbora
The surface quality of metallized polymer parts plays a crucial role in the functional and aesthetic properties of the final product, especially in the automotive industry and consumer electronics. This study focuses on non-contact measurement of the surface structure of ABS polymer parts, metallized using vacuum technology with the application of copper, nickel and chromium layers. The measurement was performed on the Talysurf CLI 500 device and evaluated according to the current standards ČSN EN ISO 21920-1 and 21920-2, which define the profile roughness parameters and the methodology for their assessment. The measured data were subsequently analyzed using descriptive statistics and multivariate statistical methods, including analysis of variance (ANOVA), which revealed differences between samples manufactured in different time periods and under different plating conditions. The results confirm that the metal coating significantly affects the values of key surface structure parameters (Ra, Rz) and indicate the need for consistent control of process stability. The study also confirms the benefits of statistical tools for quality assessment and supports their wider use in technological practice. © Published under licence by IOP Publishing Ltd.
</summary>
<dc:date>2025-01-01T00:00:00Z</dc:date>
</entry>
<entry>
<title>Analysis of the laser beam trajectory and its impact on the cut in PMMA material</title>
<link href="http://hdl.handle.net/10563/1012768" rel="alternate"/>
<author>
<name>Knedlová, Jana</name>
</author>
<author>
<name>Bílek, Ondřej</name>
</author>
<author>
<name>Bartík, David</name>
</author>
<id>http://hdl.handle.net/10563/1012768</id>
<updated>2026-02-19T10:08:27Z</updated>
<published>2025-01-01T00:00:00Z</published>
<summary type="text">Analysis of the laser beam trajectory and its impact on the cut in PMMA material
Knedlová, Jana; Bílek, Ondřej; Bartík, David
This article investigates how laser toolpath interpolation (linear vs. circular) affects kerf width, dimensional accuracy, and kerf taper when CO₂-cutting PMMA sheets (t = 3, 5 mm) on an ILS 3NM system (λ = 10.6 μm; P = 100 W). Lenses with f = 1.5″ and 2.5″ were assessed. At identical process settings adopted from prior work, linear toolpaths yielded larger outer dimensions and more pronounced kerf taper, whereas circular paths produced a more stable kerf and smaller taper angles. In all cases, the top-surface kerf exceeded the bottom-surface kerf. Thinner sheets (3 mm) were more sensitive to focal length and heat accumulation. The results provide practical guidance for selecting optics and toolpaths to balance accuracy and edge quality in PMMA cutting and form an experimental basis for future AI-assisted process optimization. © Published under licence by IOP Publishing Ltd.
</summary>
<dc:date>2025-01-01T00:00:00Z</dc:date>
</entry>
<entry>
<title>SD-LSTM: A novel semi–decentralized LSTM architecture for scalable and accurate stock price prediction</title>
<link href="http://hdl.handle.net/10563/1012728" rel="alternate"/>
<author>
<name>Li, Peng</name>
</author>
<author>
<name>Šenkeřík, Roman</name>
</author>
<author>
<name>Komínková Oplatková, Zuzana</name>
</author>
<id>http://hdl.handle.net/10563/1012728</id>
<updated>2026-03-26T13:13:50Z</updated>
<published>2026-01-01T00:00:00Z</published>
<summary type="text">SD-LSTM: A novel semi–decentralized LSTM architecture for scalable and accurate stock price prediction
Li, Peng; Šenkeřík, Roman; Komínková Oplatková, Zuzana
This study introduces a novel Semi-Decentralized Long Short-Term Memory (SD-LSTM) architecture and compares its performance against a traditional LSTM model for stock price prediction, examining both accuracy and training time. All experiments employ canonical settings. Results indicate that SD-LSTM consistently achieves better prediction accuracy, evidenced by significantly lower mean squared error, across stock data from five major U.S. companies (Apple, NVIDIA, Amazon, Alphabet, Microsoft). Moreover, SD-LSTM accomplishes these improvements with fewer parameters. In terms of training speed, SD-LSTM is substantially faster than traditional LSTM when handling larger datasets and more complex configurations, highlighting its efficiency in parallel processing. Overall, these findings underscore the potential of this new SD-LSTM architecture for large-scale applications and its viability for integration into both established and emerging hybrid approaches that demand advanced predictive accuracy and computational efficiency.
</summary>
<dc:date>2026-01-01T00:00:00Z</dc:date>
</entry>
<entry>
<title>Evaluating NLP tools for AI in software requirements analysis</title>
<link href="http://hdl.handle.net/10563/1012726" rel="alternate"/>
<author>
<name>Okechukwu, Cornelius Chimuanya</name>
</author>
<author>
<name>Šilhavý, Radek</name>
</author>
<author>
<name>Šilhavý, Petr</name>
</author>
<id>http://hdl.handle.net/10563/1012726</id>
<updated>2026-02-17T12:10:05Z</updated>
<published>2025-01-01T00:00:00Z</published>
<summary type="text">Evaluating NLP tools for AI in software requirements analysis
Okechukwu, Cornelius Chimuanya; Šilhavý, Radek; Šilhavý, Petr
Software requirements analysis is increasingly automated by applying natural language processing (NLP) tools, enhancing efficiency and precision. This research employs the Mendeley FR_NFR dataset to evaluate the classification of functional requirements (FR) and non-functional requirements (NFR) utilising three NLP tools: NLTK, OpenAI, and spaCy. The evaluation uses performance indicators such as F1-score, recall, accuracy, precision, and confusion matrices. OpenAI is a good option for high-stakes applications because of its 94% F1-score and exceptional accuracy, despite the associated API costs. With 83% accuracy and 0.1 s per query, spaCy is ideal for real-time applications because it balances speed and efficiency. With its 68% accuracy rate, NLTK’s rule-based methodology remains a viable choice for prototyping or for controlled settings where transparency is crucial. With an average accuracy of 92%, the results show that OpenAI’s transformer-based model performs better than NLTK and spaCy, even though spaCy has an advantage in entity recognition. This study provides practitioners with critical insights by elucidating the trade-offs between accuracy, interpretability, and computational efficiency.
</summary>
<dc:date>2025-01-01T00:00:00Z</dc:date>
</entry>
</feed>
